Dec 16 14:54:34 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 16 14:54:34 crc restorecon[4706]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 14:54:34 crc restorecon[4706]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 
14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:54:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34
crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 14:54:34 crc restorecon[4706]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 
crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:54:34 crc restorecon[4706]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc 
restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 14:54:34 crc restorecon[4706]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 14:54:34 crc restorecon[4706]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 16 14:54:35 crc kubenswrapper[4775]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 14:54:35 crc kubenswrapper[4775]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 16 14:54:35 crc kubenswrapper[4775]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 14:54:35 crc kubenswrapper[4775]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 16 14:54:35 crc kubenswrapper[4775]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 16 14:54:35 crc kubenswrapper[4775]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.181161 4775 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184832 4775 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184872 4775 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184879 4775 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184913 4775 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184919 4775 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184924 4775 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184929 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184933 4775 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184936 4775 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184940 4775 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184944 4775 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184948 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184951 4775 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184956 4775 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184959 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184963 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184968 4775 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184973 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184979 4775 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184985 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.184990 4775 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185005 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185010 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185014 4775 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185018 4775 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185023 4775 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185026 4775 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185030 4775 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185035 4775 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185038 4775 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185042 4775 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185046 4775 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185051 4775 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185055 4775 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185058 4775 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185063 4775 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185069 4775 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185073 4775 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185077 4775 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185081 4775 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185085 4775 feature_gate.go:330] unrecognized feature gate: Example
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185089 4775 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185093 4775 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185096 4775 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185100 4775 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185104 4775 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185108 4775 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185113 4775 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185117 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185121 4775 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185126 4775 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185129 4775 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185134 4775 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185139 4775 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185143 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185147 4775 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185150 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185159 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185163 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185168 4775 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185171 4775 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185175 4775 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185178 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185181 4775 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185185 4775 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185189 4775 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185192 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185195 4775 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185199 4775 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185202 4775 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.185206 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185310 4775 flags.go:64] FLAG: --address="0.0.0.0"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185321 4775 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185332 4775 flags.go:64] FLAG: --anonymous-auth="true"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185338 4775 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185343 4775 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185348 4775 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185358 4775 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185363 4775 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185368 4775 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185372 4775 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185377 4775 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185403 4775 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185408 4775 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185414 4775 flags.go:64] FLAG: --cgroup-root=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185419 4775 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185423 4775 flags.go:64] FLAG: --client-ca-file=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185427 4775 flags.go:64] FLAG: --cloud-config=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185431 4775 flags.go:64] FLAG: --cloud-provider=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185435 4775 flags.go:64] FLAG: --cluster-dns="[]"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185450 4775 flags.go:64] FLAG: --cluster-domain=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185454 4775 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185458 4775 flags.go:64] FLAG: --config-dir=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185475 4775 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185479 4775 flags.go:64] FLAG: --container-log-max-files="5"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185486 4775 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185490 4775 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185495 4775 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185499 4775 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185504 4775 flags.go:64] FLAG: --contention-profiling="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185508 4775 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185512 4775 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185517 4775 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185520 4775 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185526 4775 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185530 4775 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185534 4775 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185539 4775 flags.go:64] FLAG: --enable-load-reader="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185544 4775 flags.go:64] FLAG: --enable-server="true"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185548 4775 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185556 4775 flags.go:64] FLAG: --event-burst="100"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185561 4775 flags.go:64] FLAG: --event-qps="50"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185565 4775 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185570 4775 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185575 4775 flags.go:64] FLAG: --eviction-hard=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185582 4775 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185589 4775 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185595 4775 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185600 4775 flags.go:64] FLAG: --eviction-soft=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185605 4775 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185611 4775 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185615 4775 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185621 4775 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185626 4775 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185632 4775 flags.go:64] FLAG: --fail-swap-on="true"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185637 4775 flags.go:64] FLAG: --feature-gates=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185643 4775 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185648 4775 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185653 4775 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185667 4775 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185672 4775 flags.go:64] FLAG: --healthz-port="10248"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185678 4775 flags.go:64] FLAG: --help="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185683 4775 flags.go:64] FLAG: --hostname-override=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185688 4775 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185693 4775 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185698 4775 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185703 4775 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185708 4775 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185713 4775 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185718 4775 flags.go:64] FLAG: --image-service-endpoint=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185723 4775 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185728 4775 flags.go:64] FLAG: --kube-api-burst="100"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185733 4775 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185737 4775 flags.go:64] FLAG: --kube-api-qps="50"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185741 4775 flags.go:64] FLAG: --kube-reserved=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185746 4775 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185750 4775 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185754 4775 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185760 4775 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185764 4775 flags.go:64] FLAG: --lock-file=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185768 4775 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185772 4775 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185777 4775 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185784 4775 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185788 4775 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185792 4775 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185798 4775 flags.go:64] FLAG: --logging-format="text"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185802 4775 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185807 4775 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185811 4775 flags.go:64] FLAG: --manifest-url=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185815 4775 flags.go:64] FLAG: --manifest-url-header=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185822 4775 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185827 4775 flags.go:64] FLAG: --max-open-files="1000000"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185832 4775 flags.go:64] FLAG: --max-pods="110"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185837 4775 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185849 4775 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185854 4775 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185858 4775 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185862 4775 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185867 4775 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185871 4775 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185883 4775 flags.go:64] FLAG: --node-status-max-images="50"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185903 4775 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185907 4775 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185911 4775 flags.go:64] FLAG: --pod-cidr=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185916 4775 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185923 4775 flags.go:64] FLAG: --pod-manifest-path=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185927 4775 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185931 4775 flags.go:64] FLAG: --pods-per-core="0"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185935 4775 flags.go:64] FLAG: --port="10250"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185940 4775 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185945 4775 flags.go:64] FLAG: --provider-id=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185949 4775 flags.go:64] FLAG: --qos-reserved=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185953 4775 flags.go:64] FLAG: --read-only-port="10255"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185958 4775 flags.go:64] FLAG: --register-node="true"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185962 4775 flags.go:64] FLAG: --register-schedulable="true"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185967 4775 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185975 4775 flags.go:64] FLAG: --registry-burst="10"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.185979 4775 flags.go:64] FLAG: --registry-qps="5"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186024 4775 flags.go:64] FLAG: --reserved-cpus=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186030 4775 flags.go:64] FLAG: --reserved-memory=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186035 4775 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186039 4775 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186043 4775 flags.go:64] FLAG: --rotate-certificates="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186047 4775 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186052 4775 flags.go:64] FLAG: --runonce="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186057 4775 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186062 4775 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186068 4775 flags.go:64] FLAG: --seccomp-default="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186073 4775 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186078 4775 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186094 4775 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186100 4775 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186105 4775 flags.go:64] FLAG: --storage-driver-password="root"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186110 4775 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186114 4775 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186119 4775 flags.go:64] FLAG: --storage-driver-user="root"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186124 4775 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186129 4775 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186133 4775 flags.go:64] FLAG: --system-cgroups=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186138 4775 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186146 4775 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186150 4775 flags.go:64] FLAG: --tls-cert-file=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186156 4775 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186166 4775 flags.go:64] FLAG: --tls-min-version=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186171 4775 flags.go:64] FLAG: --tls-private-key-file=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186176 4775 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186181 4775 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186186 4775 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186190 4775 flags.go:64] FLAG: --v="2"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186199 4775 flags.go:64] FLAG: --version="false"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186215 4775 flags.go:64] FLAG: --vmodule=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186221 4775 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186226 4775 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186385 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186394 4775 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186399 4775 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186403 4775 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186408 4775 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186412 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186427 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186433 4775 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186439 4775 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186443 4775 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186448 4775 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186453 4775 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186458 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186472 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186477 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186480 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186485 4775 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186489 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186493 4775 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186497 4775 feature_gate.go:330] unrecognized feature gate: Example
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186502 4775 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186507 4775 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186512 4775 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186517 4775 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186521 4775 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186526 4775 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186530 4775 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186534 4775 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186538 4775 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186547 4775 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186553 4775 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186557 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186562 4775 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186566 4775 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186570 4775 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186575 4775 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186579 4775 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186583 4775 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186587 4775 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186592 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186596 4775 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186600 4775 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186604 4775 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186608 4775 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186613 4775 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186617 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186621 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186625 4775 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186630 4775 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186637 4775 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186641 4775 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186645 4775 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186650 4775 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186662 4775 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186668 4775 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186673 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186678 4775 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186682 4775 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186688 4775 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186694 4775 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186699 4775 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186707 4775 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186712 4775 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186717 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186721 4775 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186726 4775 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186730 4775 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186735 4775 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186739 4775 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186744 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.186750 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.186770 4775 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.198139 4775 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.198195 4775 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198288 4775 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198297 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198302 4775 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198307 4775 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198313 4775 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198317 4775 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198321 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198325 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198329 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198333 4775 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198337 4775 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198341 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198345 4775 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198349 4775 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198352 4775 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198357 4775 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198362 4775 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198367 4775 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198371 4775 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198375 4775 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198379 4775 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198383 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198387 4775 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198391 4775 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198395 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198399 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198404 4775 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198410 4775 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198414 4775 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198418 4775 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198422 4775 
feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198425 4775 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198429 4775 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198433 4775 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198437 4775 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198440 4775 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198444 4775 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198448 4775 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198452 4775 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198456 4775 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198459 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198464 4775 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198471 4775 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198475 4775 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198479 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198483 4775 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198487 4775 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198491 4775 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198494 4775 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198498 4775 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198503 4775 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198508 4775 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198512 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198516 4775 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198520 4775 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198525 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198528 4775 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198533 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198536 4775 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198540 4775 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198543 4775 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198547 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198551 4775 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198554 4775 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198558 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198562 
4775 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198565 4775 feature_gate.go:330] unrecognized feature gate: Example Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198569 4775 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198573 4775 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198576 4775 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198581 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.198588 4775 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198710 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198719 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198723 4775 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198726 4775 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198730 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 16 
14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198734 4775 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198737 4775 feature_gate.go:330] unrecognized feature gate: Example Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198740 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198744 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198748 4775 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198754 4775 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198758 4775 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198761 4775 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198765 4775 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198769 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198772 4775 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198776 4775 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198780 4775 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198785 4775 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 16 14:54:35 crc 
kubenswrapper[4775]: W1216 14:54:35.198789 4775 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198793 4775 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198797 4775 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198801 4775 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198805 4775 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198810 4775 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198814 4775 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198819 4775 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198823 4775 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198827 4775 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198832 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198836 4775 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198839 4775 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198843 4775 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198846 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198850 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198853 4775 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198856 4775 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198861 4775 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198864 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198868 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198871 4775 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 16 14:54:35 crc 
kubenswrapper[4775]: W1216 14:54:35.198875 4775 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198878 4775 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198881 4775 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198904 4775 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198907 4775 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198911 4775 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198914 4775 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198918 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198924 4775 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198928 4775 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198933 4775 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198937 4775 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198941 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198945 4775 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198949 4775 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198954 4775 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198960 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198965 4775 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198969 4775 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198973 4775 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198977 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198981 4775 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198985 4775 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198988 4775 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 16 14:54:35 crc kubenswrapper[4775]: 
W1216 14:54:35.198992 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198995 4775 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.198999 4775 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.199002 4775 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.199006 4775 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.199010 4775 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.199016 4775 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.199748 4775 server.go:940] "Client rotation is on, will bootstrap in background" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.202577 4775 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.202663 4775 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.203142 4775 server.go:997] "Starting client certificate rotation" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.203167 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.203485 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-13 07:33:43.837731871 +0000 UTC Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.203617 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.208300 4775 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 16 14:54:35 crc kubenswrapper[4775]: E1216 14:54:35.210003 4775 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.210015 4775 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.222566 4775 log.go:25] "Validated CRI v1 runtime API" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.236525 4775 log.go:25] "Validated CRI v1 image API" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.238199 4775 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.240842 4775 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-16-14-50-21-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.240923 4775 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.264276 4775 manager.go:217] Machine: {Timestamp:2025-12-16 14:54:35.262549651 +0000 UTC m=+0.213628614 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:1c1c08a3-d604-4a9e-b8da-c0df5af4d40b BootID:4dbd1130-4ad9-49a4-81ac-e33bda81b192 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 
Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:bf:f6:24 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:bf:f6:24 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:e2:46:bc Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:f3:94:71 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:bb:51:fa Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ca:70:1a Speed:-1 Mtu:1496} {Name:eth10 MacAddress:92:f1:2e:8e:fb:37 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:96:aa:55:df:03:79 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 
Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.264643 4775 manager_no_libpfm.go:29] cAdvisor is build 
without cgo and/or libpfm support. Perf event counters are not available. Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.264874 4775 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.265406 4775 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.265675 4775 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.265729 4775 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Q
uantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.266086 4775 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.266101 4775 container_manager_linux.go:303] "Creating device plugin manager" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.266366 4775 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.266418 4775 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.266727 4775 state_mem.go:36] "Initialized new in-memory state store" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.266957 4775 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.267781 4775 kubelet.go:418] "Attempting to sync node with API server" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.267811 4775 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.267849 4775 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.267880 4775 kubelet.go:324] "Adding apiserver pod source" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.267917 4775 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.269406 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Dec 16 14:54:35 crc kubenswrapper[4775]: E1216 14:54:35.269520 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.269409 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Dec 16 14:54:35 crc kubenswrapper[4775]: E1216 14:54:35.269573 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.270054 4775 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.270477 4775 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.271394 4775 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.272048 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.272109 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.272122 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.272133 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.272150 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.272162 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.272172 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.272190 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.272203 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.272215 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.272245 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.272259 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.272509 4775 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.273067 4775 server.go:1280] "Started kubelet" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.273193 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.273466 4775 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.273472 4775 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.274258 4775 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 14:54:35 crc systemd[1]: Started Kubernetes Kubelet. Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.275540 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.275592 4775 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 14:54:35 crc kubenswrapper[4775]: E1216 14:54:35.275440 4775 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1881b9da189f234c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 14:54:35.27303662 +0000 UTC m=+0.224115553,LastTimestamp:2025-12-16 14:54:35.27303662 +0000 UTC 
m=+0.224115553,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.275840 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 15:16:14.597240355 +0000 UTC Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.275911 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 72h21m39.321332126s for next certificate rotation Dec 16 14:54:35 crc kubenswrapper[4775]: E1216 14:54:35.275910 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.276202 4775 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.276236 4775 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.276255 4775 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.276222 4775 server.go:460] "Adding debug handlers to kubelet server" Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.276559 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Dec 16 14:54:35 crc kubenswrapper[4775]: E1216 14:54:35.276618 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" 
logger="UnhandledError" Dec 16 14:54:35 crc kubenswrapper[4775]: E1216 14:54:35.277699 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="200ms" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.280188 4775 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.280224 4775 factory.go:55] Registering systemd factory Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.280240 4775 factory.go:221] Registration of the systemd container factory successfully Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.282149 4775 factory.go:153] Registering CRI-O factory Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.282273 4775 factory.go:221] Registration of the crio container factory successfully Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.282346 4775 factory.go:103] Registering Raw factory Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.282408 4775 manager.go:1196] Started watching for new ooms in manager Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.283828 4775 manager.go:319] Starting recovery of all containers Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295268 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295331 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295345 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295354 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295364 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295374 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295384 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295395 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295409 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295422 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295433 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295442 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295453 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295464 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295479 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295487 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295498 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295512 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295524 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295537 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" 
seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295552 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295566 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295578 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295590 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295605 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295618 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 16 14:54:35 crc 
kubenswrapper[4775]: I1216 14:54:35.295631 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295642 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295655 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295668 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295678 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295688 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295699 4775 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295709 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295722 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295733 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295743 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295754 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295764 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295775 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295786 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295797 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295807 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295818 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295828 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.295837 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296468 4775 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296502 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296515 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296530 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296542 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296555 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296568 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296584 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296595 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296606 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296617 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296630 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296640 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296651 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296667 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296677 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296688 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296698 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296708 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296721 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296734 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296745 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296756 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296766 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296778 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296788 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296799 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296811 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296822 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296832 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296843 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296854 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296867 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296880 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296906 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296919 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296933 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296955 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296964 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296982 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.296995 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297009 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297019 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297033 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297046 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297059 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297073 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297088 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297099 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297110 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297121 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297132 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297143 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297156 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297167 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297178 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297189 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297200 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297210 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297226 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297241 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297251 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297263 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297274 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297285 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297294 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297304 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297315 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297325 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297336 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297346 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297355 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297366 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297377 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297386 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297396 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297406 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297416 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297425 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297434 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297443 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297455 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297467 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297480 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297493 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297504 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297517 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297533 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297543 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297555 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297567 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297578 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297593 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297609 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297622 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297638 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297652 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297665 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297680 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297698 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297711 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297759 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297780 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297793 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297806 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297818 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297830 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297842 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297856 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297869 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297898 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297912 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297927 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297940 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297953 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297969 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297984 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.297997 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298008 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298024 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298040 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298054 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298069 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298084 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298097 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298109 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298123 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298139 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298151 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298161 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298172 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298184 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298196 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298209 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298220 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298231 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298243 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298254 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298265 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136"
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298279 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298290 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298302 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298314 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298326 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298337 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 
16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298347 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298357 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298368 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298378 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298389 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298399 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298408 4775 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298417 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298428 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298438 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298448 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298457 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298467 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298476 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298485 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298496 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298505 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298514 4775 reconstruct.go:97] "Volume reconstruction finished" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.298521 4775 reconciler.go:26] "Reconciler: start to sync state" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.303496 4775 manager.go:324] Recovery completed Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.313192 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.314995 4775 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.315050 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.315064 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.316057 4775 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.316077 4775 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.316098 4775 state_mem.go:36] "Initialized new in-memory state store" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.325746 4775 policy_none.go:49] "None policy: Start" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.331364 4775 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.331576 4775 state_mem.go:35] "Initializing new in-memory state store" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.331517 4775 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.336456 4775 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.336535 4775 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.336579 4775 kubelet.go:2335] "Starting kubelet main sync loop" Dec 16 14:54:35 crc kubenswrapper[4775]: E1216 14:54:35.336650 4775 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 14:54:35 crc kubenswrapper[4775]: W1216 14:54:35.338158 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Dec 16 14:54:35 crc kubenswrapper[4775]: E1216 14:54:35.338274 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Dec 16 14:54:35 crc kubenswrapper[4775]: E1216 14:54:35.483708 4775 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 16 14:54:35 crc kubenswrapper[4775]: E1216 14:54:35.483777 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 16 14:54:35 crc kubenswrapper[4775]: E1216 14:54:35.483866 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="400ms" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.525332 4775 manager.go:334] 
"Starting Device Plugin manager" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.525388 4775 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.525402 4775 server.go:79] "Starting device plugin registration server" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.525926 4775 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.525942 4775 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.526206 4775 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.527495 4775 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.527516 4775 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 14:54:35 crc kubenswrapper[4775]: E1216 14:54:35.536342 4775 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.626327 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.628407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.628439 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.628448 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:35 crc 
kubenswrapper[4775]: I1216 14:54:35.628470 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 14:54:35 crc kubenswrapper[4775]: E1216 14:54:35.628877 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.684299 4775 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.684408 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.686052 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.686085 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.686094 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.686201 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.686352 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.686416 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.686999 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.687032 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.687048 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.687205 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.687324 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.687359 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.687987 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.688005 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.688013 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.688086 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.688129 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.688139 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.688299 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.688315 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.688322 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.688402 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.688559 
4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.688607 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.688950 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.688967 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.688975 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.689050 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.689198 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.689222 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.689631 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.689657 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.689652 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.689697 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.689709 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.689667 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.689805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.689828 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.689839 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.690243 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.690330 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.691533 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.691573 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.691585 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.786490 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.786556 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.786584 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.786613 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.786693 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.786810 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.787290 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.789611 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.789752 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.789903 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.790018 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.790949 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.791021 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.791086 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.791122 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.830018 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.831602 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.831672 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.831691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.831736 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 14:54:35 crc kubenswrapper[4775]: E1216 14:54:35.832421 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Dec 16 14:54:35 crc kubenswrapper[4775]: E1216 14:54:35.885747 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection 
refused" interval="800ms" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893000 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893091 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893142 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893187 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893229 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893246 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893328 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893353 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893262 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893416 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893493 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc 
kubenswrapper[4775]: I1216 14:54:35.893512 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893576 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893653 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893693 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893717 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893745 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893767 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893786 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893799 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893818 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893861 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:54:35 crc 
kubenswrapper[4775]: I1216 14:54:35.893930 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893970 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.893996 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.894021 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.894036 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.894083 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.894102 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:54:35 crc kubenswrapper[4775]: I1216 14:54:35.894168 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:54:36 crc kubenswrapper[4775]: I1216 14:54:36.010430 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 14:54:36 crc kubenswrapper[4775]: I1216 14:54:36.030503 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 16 14:54:36 crc kubenswrapper[4775]: W1216 14:54:36.035637 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-effb2aa6998df4dd1465f049fbe3ec71d07e757c9d75d4856052b2d8336aeeb7 WatchSource:0}: Error finding container effb2aa6998df4dd1465f049fbe3ec71d07e757c9d75d4856052b2d8336aeeb7: Status 404 returned error can't find the container with id effb2aa6998df4dd1465f049fbe3ec71d07e757c9d75d4856052b2d8336aeeb7 Dec 16 14:54:36 crc kubenswrapper[4775]: I1216 14:54:36.054315 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:54:36 crc kubenswrapper[4775]: W1216 14:54:36.057262 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-e6f7c2a5f93e4f5cf0df5d910c55875b02be4edaf62a163b4c5b9bbcfb0e684b WatchSource:0}: Error finding container e6f7c2a5f93e4f5cf0df5d910c55875b02be4edaf62a163b4c5b9bbcfb0e684b: Status 404 returned error can't find the container with id e6f7c2a5f93e4f5cf0df5d910c55875b02be4edaf62a163b4c5b9bbcfb0e684b Dec 16 14:54:36 crc kubenswrapper[4775]: W1216 14:54:36.068466 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-e8500353598dd7f61732104c66e942fe1068a84e33dcc8d3c5c2536a3a2db24a WatchSource:0}: Error finding container e8500353598dd7f61732104c66e942fe1068a84e33dcc8d3c5c2536a3a2db24a: Status 404 returned error can't find the container with id e8500353598dd7f61732104c66e942fe1068a84e33dcc8d3c5c2536a3a2db24a Dec 16 14:54:36 crc kubenswrapper[4775]: I1216 14:54:36.071814 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:54:36 crc kubenswrapper[4775]: I1216 14:54:36.076573 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:54:36 crc kubenswrapper[4775]: W1216 14:54:36.094155 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-90d24efe9c3453ec6513a10da48026bf02538507e791f57d5d7cd63c313d7b9b WatchSource:0}: Error finding container 90d24efe9c3453ec6513a10da48026bf02538507e791f57d5d7cd63c313d7b9b: Status 404 returned error can't find the container with id 90d24efe9c3453ec6513a10da48026bf02538507e791f57d5d7cd63c313d7b9b Dec 16 14:54:36 crc kubenswrapper[4775]: I1216 14:54:36.233534 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:36 crc kubenswrapper[4775]: I1216 14:54:36.235096 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:36 crc kubenswrapper[4775]: I1216 14:54:36.235146 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:36 crc kubenswrapper[4775]: I1216 14:54:36.235162 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:36 crc kubenswrapper[4775]: I1216 14:54:36.235197 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 14:54:36 crc kubenswrapper[4775]: E1216 14:54:36.235850 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Dec 16 14:54:36 crc kubenswrapper[4775]: I1216 14:54:36.274862 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection 
refused Dec 16 14:54:36 crc kubenswrapper[4775]: I1216 14:54:36.342018 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"90d24efe9c3453ec6513a10da48026bf02538507e791f57d5d7cd63c313d7b9b"} Dec 16 14:54:36 crc kubenswrapper[4775]: I1216 14:54:36.343200 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e444daded50ab33c3544c7e914c2024bca67f09cc9e438a4e0b6e195d9016090"} Dec 16 14:54:36 crc kubenswrapper[4775]: I1216 14:54:36.344338 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e8500353598dd7f61732104c66e942fe1068a84e33dcc8d3c5c2536a3a2db24a"} Dec 16 14:54:36 crc kubenswrapper[4775]: I1216 14:54:36.345395 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e6f7c2a5f93e4f5cf0df5d910c55875b02be4edaf62a163b4c5b9bbcfb0e684b"} Dec 16 14:54:36 crc kubenswrapper[4775]: I1216 14:54:36.346166 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"effb2aa6998df4dd1465f049fbe3ec71d07e757c9d75d4856052b2d8336aeeb7"} Dec 16 14:54:36 crc kubenswrapper[4775]: W1216 14:54:36.357398 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Dec 16 14:54:36 crc kubenswrapper[4775]: E1216 
14:54:36.357521 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Dec 16 14:54:36 crc kubenswrapper[4775]: W1216 14:54:36.409380 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Dec 16 14:54:36 crc kubenswrapper[4775]: E1216 14:54:36.409459 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Dec 16 14:54:36 crc kubenswrapper[4775]: W1216 14:54:36.675843 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Dec 16 14:54:36 crc kubenswrapper[4775]: E1216 14:54:36.676021 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Dec 16 14:54:36 crc kubenswrapper[4775]: E1216 14:54:36.692600 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="1.6s" Dec 16 14:54:36 crc kubenswrapper[4775]: W1216 14:54:36.908621 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Dec 16 14:54:36 crc kubenswrapper[4775]: E1216 14:54:36.908708 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.036500 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.038421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.038462 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.038473 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.038496 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 14:54:37 crc kubenswrapper[4775]: E1216 14:54:37.038957 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" 
node="crc" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.274414 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.350605 4775 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3" exitCode=0 Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.350688 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3"} Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.351251 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.352590 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.352644 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.352661 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.352846 4775 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318" exitCode=0 Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.353003 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318"} Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.353219 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.354388 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.354536 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.354679 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.357955 4775 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255" exitCode=0 Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.358195 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.358744 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255"} Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.360169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.360207 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.360221 4775 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.370786 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9"} Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.370831 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f"} Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.370846 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8"} Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.370859 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a"} Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.370982 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.372002 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.372035 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 
14:54:37.372045 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.373871 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a" exitCode=0 Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.373927 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a"} Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.374135 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.379824 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.379858 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.379871 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.383217 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.383785 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.383966 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.383990 4775 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:37 crc kubenswrapper[4775]: I1216 14:54:37.384000 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:37 crc kubenswrapper[4775]: E1216 14:54:37.385041 4775 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.378745 4775 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f" exitCode=0 Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.378872 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f"} Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.378963 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.380317 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.380353 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.380368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.381874 4775 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5303ea6c3b5cbada36b01c138cf3db28f57fa8d5974b2e35179aef3ee62e4ae1"} Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.381960 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.383519 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.383603 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.383622 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.389144 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"17fa2414d74d950bfd3e9631cdf0da6bc8b58f406d485d086d084d305ad5d466"} Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.389202 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"dbdad84c13b928859836825f69d08d47815805b625941bb708e4057dfe754d2c"} Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.389224 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.389225 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"131fe40257ce003285c74c2cc7160316851ec72690dd09901ec8b16468e0d107"} Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.390320 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.390350 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.390364 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.392780 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.392757 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e"} Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.392928 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e"} Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.392942 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138"} Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.392952 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a"} Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.393413 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.393453 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.393466 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.505771 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.640098 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.641908 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.641955 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.641972 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:38 crc kubenswrapper[4775]: I1216 14:54:38.641992 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.252645 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.397874 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6"} Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.398023 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.399474 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.399538 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.399562 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.402051 4775 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a" exitCode=0 Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.402189 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a"} Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.402300 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.402339 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.402406 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.402978 4775 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.403753 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.403773 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.403781 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.404376 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.404399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.404408 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.404411 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.404437 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.404449 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.404465 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.404502 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 
14:54:39 crc kubenswrapper[4775]: I1216 14:54:39.404526 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:40 crc kubenswrapper[4775]: I1216 14:54:40.410281 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87"} Dec 16 14:54:40 crc kubenswrapper[4775]: I1216 14:54:40.410339 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f"} Dec 16 14:54:40 crc kubenswrapper[4775]: I1216 14:54:40.410357 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74"} Dec 16 14:54:40 crc kubenswrapper[4775]: I1216 14:54:40.410368 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc"} Dec 16 14:54:40 crc kubenswrapper[4775]: I1216 14:54:40.410375 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:40 crc kubenswrapper[4775]: I1216 14:54:40.410413 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 14:54:40 crc kubenswrapper[4775]: I1216 14:54:40.410453 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:40 crc kubenswrapper[4775]: I1216 14:54:40.411394 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 
14:54:40 crc kubenswrapper[4775]: I1216 14:54:40.411407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:40 crc kubenswrapper[4775]: I1216 14:54:40.411430 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:40 crc kubenswrapper[4775]: I1216 14:54:40.411442 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:40 crc kubenswrapper[4775]: I1216 14:54:40.411442 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:40 crc kubenswrapper[4775]: I1216 14:54:40.411545 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:41 crc kubenswrapper[4775]: I1216 14:54:41.417589 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f"} Dec 16 14:54:41 crc kubenswrapper[4775]: I1216 14:54:41.417880 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:41 crc kubenswrapper[4775]: I1216 14:54:41.420302 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:41 crc kubenswrapper[4775]: I1216 14:54:41.420407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:41 crc kubenswrapper[4775]: I1216 14:54:41.420427 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:41 crc kubenswrapper[4775]: I1216 14:54:41.440032 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 16 
14:54:41 crc kubenswrapper[4775]: I1216 14:54:41.451759 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:54:41 crc kubenswrapper[4775]: I1216 14:54:41.452034 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 14:54:41 crc kubenswrapper[4775]: I1216 14:54:41.452094 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:41 crc kubenswrapper[4775]: I1216 14:54:41.453582 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:41 crc kubenswrapper[4775]: I1216 14:54:41.453625 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:41 crc kubenswrapper[4775]: I1216 14:54:41.453637 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:42 crc kubenswrapper[4775]: I1216 14:54:42.227279 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:54:42 crc kubenswrapper[4775]: I1216 14:54:42.253704 4775 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 14:54:42 crc kubenswrapper[4775]: I1216 14:54:42.253815 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 
16 14:54:42 crc kubenswrapper[4775]: I1216 14:54:42.420269 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:42 crc kubenswrapper[4775]: I1216 14:54:42.420429 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:42 crc kubenswrapper[4775]: I1216 14:54:42.421280 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:42 crc kubenswrapper[4775]: I1216 14:54:42.421315 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:42 crc kubenswrapper[4775]: I1216 14:54:42.421326 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:42 crc kubenswrapper[4775]: I1216 14:54:42.421357 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:42 crc kubenswrapper[4775]: I1216 14:54:42.421374 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:42 crc kubenswrapper[4775]: I1216 14:54:42.421384 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:42 crc kubenswrapper[4775]: I1216 14:54:42.598703 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:54:43 crc kubenswrapper[4775]: I1216 14:54:43.423240 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:43 crc kubenswrapper[4775]: I1216 14:54:43.424221 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:43 crc kubenswrapper[4775]: I1216 14:54:43.424261 4775 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:43 crc kubenswrapper[4775]: I1216 14:54:43.424272 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:43 crc kubenswrapper[4775]: I1216 14:54:43.448800 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:54:43 crc kubenswrapper[4775]: I1216 14:54:43.449066 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:43 crc kubenswrapper[4775]: I1216 14:54:43.450377 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:43 crc kubenswrapper[4775]: I1216 14:54:43.450414 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:43 crc kubenswrapper[4775]: I1216 14:54:43.450427 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:43 crc kubenswrapper[4775]: I1216 14:54:43.456274 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:54:43 crc kubenswrapper[4775]: I1216 14:54:43.675045 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:54:44 crc kubenswrapper[4775]: I1216 14:54:44.000603 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 16 14:54:44 crc kubenswrapper[4775]: I1216 14:54:44.000849 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:44 crc kubenswrapper[4775]: I1216 14:54:44.002347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 16 14:54:44 crc kubenswrapper[4775]: I1216 14:54:44.002405 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:44 crc kubenswrapper[4775]: I1216 14:54:44.002423 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:44 crc kubenswrapper[4775]: I1216 14:54:44.425433 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:44 crc kubenswrapper[4775]: I1216 14:54:44.426729 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:44 crc kubenswrapper[4775]: I1216 14:54:44.426784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:44 crc kubenswrapper[4775]: I1216 14:54:44.426800 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:44 crc kubenswrapper[4775]: I1216 14:54:44.663485 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:54:45 crc kubenswrapper[4775]: I1216 14:54:45.427526 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:45 crc kubenswrapper[4775]: I1216 14:54:45.428663 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:45 crc kubenswrapper[4775]: I1216 14:54:45.428711 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:45 crc kubenswrapper[4775]: I1216 14:54:45.428727 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:45 crc kubenswrapper[4775]: E1216 14:54:45.536954 
4775 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 16 14:54:46 crc kubenswrapper[4775]: I1216 14:54:46.431043 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:46 crc kubenswrapper[4775]: I1216 14:54:46.432963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:46 crc kubenswrapper[4775]: I1216 14:54:46.433054 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:46 crc kubenswrapper[4775]: I1216 14:54:46.433066 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:46 crc kubenswrapper[4775]: I1216 14:54:46.435315 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:54:47 crc kubenswrapper[4775]: I1216 14:54:47.432927 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:47 crc kubenswrapper[4775]: I1216 14:54:47.434007 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:47 crc kubenswrapper[4775]: I1216 14:54:47.434073 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:47 crc kubenswrapper[4775]: I1216 14:54:47.434087 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:48 crc kubenswrapper[4775]: I1216 14:54:48.041548 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 16 14:54:48 crc kubenswrapper[4775]: I1216 14:54:48.041775 4775 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Dec 16 14:54:48 crc kubenswrapper[4775]: I1216 14:54:48.042812 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:48 crc kubenswrapper[4775]: I1216 14:54:48.042854 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:48 crc kubenswrapper[4775]: I1216 14:54:48.042865 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:48 crc kubenswrapper[4775]: I1216 14:54:48.275121 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 16 14:54:48 crc kubenswrapper[4775]: E1216 14:54:48.302480 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 16 14:54:48 crc kubenswrapper[4775]: W1216 14:54:48.403427 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 16 14:54:48 crc kubenswrapper[4775]: I1216 14:54:48.403584 4775 trace.go:236] Trace[141322344]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 14:54:38.401) (total time: 10001ms): Dec 16 14:54:48 crc kubenswrapper[4775]: Trace[141322344]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (14:54:48.403) Dec 16 14:54:48 crc 
kubenswrapper[4775]: Trace[141322344]: [10.001700603s] [10.001700603s] END Dec 16 14:54:48 crc kubenswrapper[4775]: E1216 14:54:48.403618 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 16 14:54:48 crc kubenswrapper[4775]: W1216 14:54:48.544228 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 16 14:54:48 crc kubenswrapper[4775]: I1216 14:54:48.544326 4775 trace.go:236] Trace[1759166769]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 14:54:38.542) (total time: 10001ms): Dec 16 14:54:48 crc kubenswrapper[4775]: Trace[1759166769]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (14:54:48.544) Dec 16 14:54:48 crc kubenswrapper[4775]: Trace[1759166769]: [10.001517837s] [10.001517837s] END Dec 16 14:54:48 crc kubenswrapper[4775]: E1216 14:54:48.544352 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 16 14:54:48 crc kubenswrapper[4775]: I1216 14:54:48.651971 4775 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 16 14:54:48 crc kubenswrapper[4775]: I1216 14:54:48.652047 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 16 14:54:48 crc kubenswrapper[4775]: I1216 14:54:48.657954 4775 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 16 14:54:48 crc kubenswrapper[4775]: I1216 14:54:48.658034 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 16 14:54:51 crc kubenswrapper[4775]: I1216 14:54:51.458094 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:54:51 crc kubenswrapper[4775]: I1216 14:54:51.458295 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:51 crc kubenswrapper[4775]: I1216 14:54:51.458899 4775 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" 
start-of-body= Dec 16 14:54:51 crc kubenswrapper[4775]: I1216 14:54:51.458956 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 16 14:54:51 crc kubenswrapper[4775]: I1216 14:54:51.459675 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:51 crc kubenswrapper[4775]: I1216 14:54:51.459739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:51 crc kubenswrapper[4775]: I1216 14:54:51.459759 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:51 crc kubenswrapper[4775]: I1216 14:54:51.463427 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:54:51 crc kubenswrapper[4775]: I1216 14:54:51.898065 4775 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 16 14:54:51 crc kubenswrapper[4775]: I1216 14:54:51.898194 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 16 14:54:52 crc kubenswrapper[4775]: I1216 14:54:52.228807 4775 patch_prober.go:28] interesting pod/kube-apiserver-crc 
container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 16 14:54:52 crc kubenswrapper[4775]: I1216 14:54:52.228939 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 16 14:54:52 crc kubenswrapper[4775]: I1216 14:54:52.253923 4775 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 14:54:52 crc kubenswrapper[4775]: I1216 14:54:52.254018 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 14:54:52 crc kubenswrapper[4775]: I1216 14:54:52.446216 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:52 crc kubenswrapper[4775]: I1216 14:54:52.446803 4775 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: 
connection refused" start-of-body= Dec 16 14:54:52 crc kubenswrapper[4775]: I1216 14:54:52.446877 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 16 14:54:52 crc kubenswrapper[4775]: I1216 14:54:52.447787 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:52 crc kubenswrapper[4775]: I1216 14:54:52.447835 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:52 crc kubenswrapper[4775]: I1216 14:54:52.447846 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:53 crc kubenswrapper[4775]: I1216 14:54:53.651916 4775 trace.go:236] Trace[1995651132]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 14:54:39.912) (total time: 13739ms): Dec 16 14:54:53 crc kubenswrapper[4775]: Trace[1995651132]: ---"Objects listed" error: 13739ms (14:54:53.651) Dec 16 14:54:53 crc kubenswrapper[4775]: Trace[1995651132]: [13.73927739s] [13.73927739s] END Dec 16 14:54:53 crc kubenswrapper[4775]: I1216 14:54:53.651957 4775 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 16 14:54:53 crc kubenswrapper[4775]: I1216 14:54:53.654354 4775 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 16 14:54:53 crc kubenswrapper[4775]: I1216 14:54:53.654417 4775 trace.go:236] Trace[277950195]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 14:54:39.543) (total time: 14111ms): Dec 16 14:54:53 crc kubenswrapper[4775]: Trace[277950195]: ---"Objects listed" error: 14111ms 
(14:54:53.654) Dec 16 14:54:53 crc kubenswrapper[4775]: Trace[277950195]: [14.111227147s] [14.111227147s] END Dec 16 14:54:53 crc kubenswrapper[4775]: I1216 14:54:53.654452 4775 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 16 14:54:53 crc kubenswrapper[4775]: E1216 14:54:53.654458 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 16 14:54:53 crc kubenswrapper[4775]: I1216 14:54:53.659530 4775 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 16 14:54:53 crc kubenswrapper[4775]: I1216 14:54:53.985815 4775 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.180383 4775 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.279172 4775 apiserver.go:52] "Watching apiserver" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.281541 4775 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.281815 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.282553 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.284321 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.284456 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.284483 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.284656 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.284960 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.285198 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.285347 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.285487 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.288216 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.288383 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.288588 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.288794 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.289725 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.289733 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.290498 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.290516 4775 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.291112 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.321555 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.334452 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.351572 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.363555 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.377483 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.377653 4775 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.389204 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.400284 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.451847 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.454314 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6" exitCode=255 Dec 16 
14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.454358 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6"} Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459193 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459239 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459264 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459292 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459318 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459342 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459368 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459393 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459438 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459466 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459507 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459548 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459570 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459636 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459658 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459679 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459701 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459725 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459702 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459776 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459899 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459925 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459931 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.459996 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460040 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460073 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460258 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460296 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460321 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460349 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460384 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460419 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460451 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 
16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460482 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460518 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460546 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460570 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460599 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460617 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: 
"kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460671 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460699 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460722 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460749 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460773 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460853 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460904 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460958 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460982 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461008 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461035 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461064 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461089 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461116 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461142 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461165 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") 
" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461195 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461220 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461245 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461271 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461298 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461326 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461361 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461386 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461412 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461440 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461501 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 16 14:54:54 crc 
kubenswrapper[4775]: I1216 14:54:54.461534 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461600 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461625 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461650 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461675 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461699 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461730 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461798 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461830 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461874 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461914 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461940 
4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461966 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461993 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462024 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462056 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462086 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462115 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462142 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462166 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462190 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462217 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462245 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462269 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462296 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462320 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462346 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462374 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462400 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462424 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462452 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462481 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462505 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462526 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462549 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462574 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462600 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462626 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462651 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462677 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462704 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462730 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462755 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461040 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462779 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462805 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461098 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462828 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462857 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462901 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462928 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462950 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462952 4775 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462974 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.463010 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.463036 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461146 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461163 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461163 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461229 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461324 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.460215 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461701 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461738 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461740 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461806 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461946 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.463253 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.461825 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462024 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462070 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462437 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462621 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462752 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.462764 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.464971 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.465293 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.465400 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.465307 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.465904 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.466160 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.466462 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.466621 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.466642 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.467031 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.467058 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.467077 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.467260 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.467381 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.467643 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.467655 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.467792 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.467803 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.463068 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.468051 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.468125 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.468162 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.468373 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.468595 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.468703 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469131 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469177 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469381 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469397 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469486 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469535 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469563 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469586 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469612 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469632 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469653 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469674 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469816 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469827 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469845 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469869 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469922 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469942 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469984 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.470008 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.470030 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.470068 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.470093 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.470112 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.470167 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.470202 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.470261 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.470305 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.470327 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.470347 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.471982 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.472078 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.472548 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.472596 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 16 14:54:54 crc 
kubenswrapper[4775]: I1216 14:54:54.472646 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.472667 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.472845 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.472880 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473001 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473028 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473071 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473094 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473133 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473163 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473221 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") 
" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473247 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473273 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473317 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473339 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473363 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.474109 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.474139 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.474248 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.474278 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.474357 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.474381 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.474404 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.469976 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.470109 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.470150 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.470384 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.470767 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.471609 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.471615 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.472277 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.472338 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.470638 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.472900 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.472935 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473144 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473362 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473666 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473851 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473874 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.473469 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.474453 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:54:54.974431192 +0000 UTC m=+19.925510105 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.475683 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476184 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476093 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476215 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476257 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476297 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476327 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476355 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 14:54:54 crc 
kubenswrapper[4775]: I1216 14:54:54.476369 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476380 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476573 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476594 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476611 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476641 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476659 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476677 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476696 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476716 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476737 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") 
" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476759 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476779 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476796 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476812 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477041 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477058 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477081 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477101 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477160 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477190 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477218 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477239 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477262 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477285 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477307 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477328 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477349 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477371 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477395 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477414 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477432 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477450 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477552 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477569 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477581 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477591 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 
14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477602 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477613 4775 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477624 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477635 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477646 4775 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477660 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477628 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477671 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477860 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477881 4775 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477916 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477933 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477951 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477968 4775 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477990 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.478008 4775 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.478029 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.478043 4775 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc 
kubenswrapper[4775]: I1216 14:54:54.478058 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.478076 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.478092 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.478108 4775 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.478124 4775 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.478147 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.479351 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.476714 4775 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477108 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477338 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477538 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477339 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477618 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477644 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.475615 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.477927 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.479167 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.479236 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.479370 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.479380 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.479432 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.479523 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.479570 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.479737 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.479901 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.479919 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.480103 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.480508 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.480927 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.481198 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.481325 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.481316 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.484551 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.481648 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.481680 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.481821 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.481827 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.482068 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.482135 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.482605 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.482662 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.482931 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.484014 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.484720 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.485242 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.485330 4775 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.485329 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.485559 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.485666 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.485734 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:54:54.985714849 +0000 UTC m=+19.936793762 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.486549 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.486671 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.486802 4775 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.487019 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:54:54.986928517 +0000 UTC m=+19.938007440 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.487133 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.490834 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.490951 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491103 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491140 4775 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491270 4775 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491379 4775 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491407 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491426 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc 
kubenswrapper[4775]: I1216 14:54:54.491440 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491451 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491386 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491522 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491544 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491556 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491574 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 16 
14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491589 4775 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491604 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491617 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491613 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491636 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491703 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491677 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491857 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491900 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491924 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491940 4775 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491963 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491980 4775 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491994 
4775 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.492011 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.492030 4775 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.492045 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.492062 4775 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.492078 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.492101 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.492117 4775 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.492137 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.492154 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.492176 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.491981 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.492007 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.492054 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.494570 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.495550 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.495552 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.495905 4775 scope.go:117] "RemoveContainer" containerID="7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.496199 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.496304 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.496341 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.496607 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.497178 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.498129 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.500991 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.501092 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.501102 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.501160 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.501181 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.501351 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 14:54:55.001323602 +0000 UTC m=+19.952402525 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.501516 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.503400 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.503903 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.503980 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.504000 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.504011 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.504065 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 14:54:55.004052499 +0000 UTC m=+19.955131622 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.504283 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.505307 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.505394 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.505475 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.505845 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.505901 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.506062 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.509630 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.528258 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.528501 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.529269 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.529560 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.535021 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.535576 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.537475 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.538342 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.538542 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.538659 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.538723 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.542841 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.543409 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.543503 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.547333 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.547478 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.547607 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.548519 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.549080 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.549120 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.549172 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.549571 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.549602 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.549958 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.550227 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.550335 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.550357 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.551125 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.551656 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.551653 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.551675 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.551944 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.551993 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.552135 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.552169 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.552383 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.552493 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.553386 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.553961 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.557437 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.558921 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.559154 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.562421 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.564809 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.574644 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.575688 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.578771 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.579593 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.580289 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.586720 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.588287 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.588394 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.589142 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.589481 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.589537 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.589613 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.589816 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.591099 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.591644 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592116 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592583 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592645 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592706 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 16 
14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592720 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592735 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592750 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592764 4775 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592779 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592792 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592805 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592818 4775 
reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592832 4775 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592844 4775 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592858 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592872 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592883 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592913 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592921 4775 reconciler_common.go:293] "Volume detached for volume 
\"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592880 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592931 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593000 4775 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593018 4775 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593035 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593050 4775 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593065 4775 
reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593080 4775 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593095 4775 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593109 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593122 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593137 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593151 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593166 4775 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593179 4775 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593194 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593208 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593229 4775 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593243 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593256 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593269 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593282 4775 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593298 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593312 4775 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593325 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593338 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593351 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593365 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath 
\"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593381 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593393 4775 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593474 4775 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593491 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593505 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593519 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593533 4775 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593547 4775 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593561 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593574 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593588 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593600 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593614 4775 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593627 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593644 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" 
(UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593659 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593671 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593687 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593702 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593715 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593727 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593740 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on 
node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593752 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593765 4775 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593779 4775 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593792 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593806 4775 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593819 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593833 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc 
kubenswrapper[4775]: I1216 14:54:54.593847 4775 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593860 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.593873 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594021 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594040 4775 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594054 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594068 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594081 4775 reconciler_common.go:293] "Volume detached 
for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594094 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594108 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594123 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594136 4775 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594149 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594163 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594174 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on 
node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594189 4775 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594212 4775 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594227 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594241 4775 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594255 4775 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594268 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594281 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: 
I1216 14:54:54.594293 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594307 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594321 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594336 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594348 4775 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594362 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594374 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594386 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594399 4775 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594411 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594425 4775 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594438 4775 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594456 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594469 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594481 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc 
kubenswrapper[4775]: I1216 14:54:54.594494 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594506 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594521 4775 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594534 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594548 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594561 4775 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594573 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594584 4775 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594597 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594609 4775 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594622 4775 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594636 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594650 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594665 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594677 4775 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.594691 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.592709 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.596078 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.596669 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.596841 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.603795 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.612138 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.612756 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.625334 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.626409 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:54:54 crc kubenswrapper[4775]: W1216 14:54:54.646452 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-d9235596c254a0d744f23472ba60cc060a5fad6f211298c9cc7c4263903809b9 WatchSource:0}: Error finding container d9235596c254a0d744f23472ba60cc060a5fad6f211298c9cc7c4263903809b9: Status 404 returned error can't find the container with id d9235596c254a0d744f23472ba60cc060a5fad6f211298c9cc7c4263903809b9 Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.695436 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.695733 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.695743 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.695754 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.999638 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.999756 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:54:54 crc kubenswrapper[4775]: I1216 14:54:54.999784 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.999875 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:54:54 crc kubenswrapper[4775]: E1216 14:54:54.999958 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:54:55.999941022 +0000 UTC m=+20.951019945 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:54:55 crc kubenswrapper[4775]: E1216 14:54:55.000338 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:54:55 crc kubenswrapper[4775]: E1216 14:54:55.000431 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:54:56.000422817 +0000 UTC m=+20.951501750 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:54:55 crc kubenswrapper[4775]: E1216 14:54:55.000563 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:54:56.000541191 +0000 UTC m=+20.951620124 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.101003 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.101046 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:54:55 crc kubenswrapper[4775]: E1216 14:54:55.101207 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:54:55 crc kubenswrapper[4775]: E1216 14:54:55.101229 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:54:55 crc kubenswrapper[4775]: E1216 14:54:55.101244 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:54:55 crc kubenswrapper[4775]: E1216 14:54:55.101257 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:54:55 crc kubenswrapper[4775]: E1216 14:54:55.101300 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 14:54:56.101279825 +0000 UTC m=+21.052358748 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:54:55 crc kubenswrapper[4775]: E1216 14:54:55.101304 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:54:55 crc kubenswrapper[4775]: E1216 14:54:55.101332 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:54:55 crc kubenswrapper[4775]: E1216 14:54:55.101414 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" 
failed. No retries permitted until 2025-12-16 14:54:56.101386669 +0000 UTC m=+21.052465632 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.337416 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:54:55 crc kubenswrapper[4775]: E1216 14:54:55.337578 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.342153 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.343022 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.343861 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.344641 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.345416 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.345994 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.346824 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.347538 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.348363 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.349060 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.349683 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.350523 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.351210 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.351880 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.354537 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.355151 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.355879 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.356277 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.356751 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded 
a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.356942 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.357521 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.358040 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.358617 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.359071 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.359715 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.360124 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.360783 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.361491 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.364615 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.365475 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.366216 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.366783 4775 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.366926 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.368639 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.369490 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.370028 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.371461 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.372310 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.375079 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.375693 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.376739 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.377320 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.378367 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.379126 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.380261 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.380751 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.381826 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.382347 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.383494 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.384070 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.385074 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.385659 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.386377 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.387551 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.388295 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.396854 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.420206 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.443238 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.462520 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64"} Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.462634 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4970a78b942119962cba2f82c6c8fcc63325f93f5b677c67383623d810d87494"} Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.462992 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.465091 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.466597 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857"} Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.467364 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.467711 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d9235596c254a0d744f23472ba60cc060a5fad6f211298c9cc7c4263903809b9"} Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.469317 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3"} Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.469341 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690"} Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.469352 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2270db28f5d48814b92681622e12ed03ec6011e5b7726d15fb4e16e00b3e51ab"} Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.479783 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.493828 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.508608 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.528628 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.588820 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.613722 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.632357 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.651411 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:55 crc kubenswrapper[4775]: I1216 14:54:55.668600 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.010636 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.010795 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.010846 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.010908 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:54:58.010853425 +0000 UTC m=+22.961932378 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.010985 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.011056 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:54:58.011038901 +0000 UTC m=+22.962117864 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.011185 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.011473 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:54:58.011353931 +0000 UTC m=+22.962432894 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.111751 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.111920 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.112012 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.112060 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.112074 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.112136 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 14:54:58.112116136 +0000 UTC m=+23.063195059 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.112028 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.112198 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.112211 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.112286 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 14:54:58.11224604 +0000 UTC m=+23.063324983 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.336858 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.336939 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.337071 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.337207 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.855094 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.857700 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.857774 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.857785 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.857860 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.869563 4775 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.869989 4775 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.871665 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.871702 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.871717 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.871738 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.871748 4775 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:56Z","lastTransitionTime":"2025-12-16T14:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.898364 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.906361 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.906407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.906420 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.906442 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.906455 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:56Z","lastTransitionTime":"2025-12-16T14:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.922412 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.927146 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.927190 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.927201 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.927218 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.927232 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:56Z","lastTransitionTime":"2025-12-16T14:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.940684 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.944990 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.945039 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.945053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.945069 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.945084 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:56Z","lastTransitionTime":"2025-12-16T14:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.956940 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.961196 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.961251 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.961265 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.961285 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.961296 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:56Z","lastTransitionTime":"2025-12-16T14:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.978111 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:56 crc kubenswrapper[4775]: E1216 14:54:56.978232 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.980577 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.980633 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.980647 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:56 crc kubenswrapper[4775]: I1216 14:54:56.980671 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:56 crc 
kubenswrapper[4775]: I1216 14:54:56.980685 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:56Z","lastTransitionTime":"2025-12-16T14:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.083562 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.083616 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.083626 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.083641 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.083651 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:57Z","lastTransitionTime":"2025-12-16T14:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.186565 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.186627 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.186635 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.186651 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.186683 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:57Z","lastTransitionTime":"2025-12-16T14:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.288448 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.288496 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.288510 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.288525 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.288534 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:57Z","lastTransitionTime":"2025-12-16T14:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.337654 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:54:57 crc kubenswrapper[4775]: E1216 14:54:57.337830 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.391207 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.391251 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.391262 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.391278 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.391287 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:57Z","lastTransitionTime":"2025-12-16T14:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.411854 4775 csr.go:261] certificate signing request csr-4lxwf is approved, waiting to be issued Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.428909 4775 csr.go:257] certificate signing request csr-4lxwf is issued Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.465660 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-f2p7z"] Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.466046 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-f2p7z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.469704 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.470098 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.472336 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.476564 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d"} Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.493554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.493596 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.493605 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.493622 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.493632 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:57Z","lastTransitionTime":"2025-12-16T14:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.513221 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:57Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.524449 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tkgx\" (UniqueName: \"kubernetes.io/projected/f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5-kube-api-access-6tkgx\") pod \"node-resolver-f2p7z\" (UID: \"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\") " pod="openshift-dns/node-resolver-f2p7z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.524513 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5-hosts-file\") pod \"node-resolver-f2p7z\" (UID: \"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\") " pod="openshift-dns/node-resolver-f2p7z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.533983 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:57Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.558740 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 
14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:57Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.574389 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:57Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.595686 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.595731 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:57 crc 
kubenswrapper[4775]: I1216 14:54:57.595743 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.595760 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.595772 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:57Z","lastTransitionTime":"2025-12-16T14:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.597021 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:57Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.611688 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:57Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.625713 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5-hosts-file\") pod \"node-resolver-f2p7z\" (UID: \"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\") " pod="openshift-dns/node-resolver-f2p7z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.625782 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tkgx\" (UniqueName: \"kubernetes.io/projected/f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5-kube-api-access-6tkgx\") pod \"node-resolver-f2p7z\" (UID: \"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\") " pod="openshift-dns/node-resolver-f2p7z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.625867 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5-hosts-file\") pod \"node-resolver-f2p7z\" (UID: \"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\") " pod="openshift-dns/node-resolver-f2p7z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.627393 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:57Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.645329 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:57Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.646475 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tkgx\" (UniqueName: \"kubernetes.io/projected/f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5-kube-api-access-6tkgx\") pod \"node-resolver-f2p7z\" (UID: \"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\") " pod="openshift-dns/node-resolver-f2p7z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.669492 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 
14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:57Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.684902 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:57Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.696527 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:57Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.698623 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.698660 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.698669 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.698684 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.698695 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:57Z","lastTransitionTime":"2025-12-16T14:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.708248 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:57Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.719416 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:57Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.735882 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:57Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.749953 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:57Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.772489 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:57Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.777788 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-f2p7z" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.801320 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.801639 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.801651 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.801666 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.801678 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:57Z","lastTransitionTime":"2025-12-16T14:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.903857 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.903941 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.903958 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.903982 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:57 crc kubenswrapper[4775]: I1216 14:54:57.904000 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:57Z","lastTransitionTime":"2025-12-16T14:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.006161 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.006196 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.006207 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.006224 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.006238 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:58Z","lastTransitionTime":"2025-12-16T14:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.028503 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.028605 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.028649 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:54:58 crc kubenswrapper[4775]: E1216 14:54:58.028802 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:54:58 crc kubenswrapper[4775]: E1216 14:54:58.028869 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:02.028847421 +0000 UTC m=+26.979926344 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:54:58 crc kubenswrapper[4775]: E1216 14:54:58.029709 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:55:02.029696497 +0000 UTC m=+26.980775440 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:54:58 crc kubenswrapper[4775]: E1216 14:54:58.029768 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:54:58 crc kubenswrapper[4775]: E1216 14:54:58.029800 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:02.02979226 +0000 UTC m=+26.980871183 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.056804 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-lh6xh"] Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.057291 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.060129 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.060587 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.065428 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.065983 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.066068 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.066604 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.086220 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" 
Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.090751 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.108819 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.108860 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.108869 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.108910 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.108925 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:58Z","lastTransitionTime":"2025-12-16T14:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.112860 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.125168 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.129265 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/584613dc-ef95-4911-9a79-76e805e1d4d1-rootfs\") pod \"machine-config-daemon-lh6xh\" (UID: \"584613dc-ef95-4911-9a79-76e805e1d4d1\") " pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.129308 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-x482d\" (UniqueName: \"kubernetes.io/projected/584613dc-ef95-4911-9a79-76e805e1d4d1-kube-api-access-x482d\") pod \"machine-config-daemon-lh6xh\" (UID: \"584613dc-ef95-4911-9a79-76e805e1d4d1\") " pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.129335 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/584613dc-ef95-4911-9a79-76e805e1d4d1-proxy-tls\") pod \"machine-config-daemon-lh6xh\" (UID: \"584613dc-ef95-4911-9a79-76e805e1d4d1\") " pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.129361 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/584613dc-ef95-4911-9a79-76e805e1d4d1-mcd-auth-proxy-config\") pod \"machine-config-daemon-lh6xh\" (UID: \"584613dc-ef95-4911-9a79-76e805e1d4d1\") " pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.129395 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.129429 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:54:58 crc kubenswrapper[4775]: E1216 14:54:58.129523 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:54:58 crc kubenswrapper[4775]: E1216 14:54:58.129553 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:54:58 crc kubenswrapper[4775]: E1216 14:54:58.129565 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:54:58 crc kubenswrapper[4775]: E1216 14:54:58.129590 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:54:58 crc kubenswrapper[4775]: E1216 14:54:58.129611 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:54:58 crc kubenswrapper[4775]: E1216 14:54:58.129627 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:54:58 crc kubenswrapper[4775]: E1216 14:54:58.129613 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:02.129595465 +0000 UTC m=+27.080674388 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:54:58 crc kubenswrapper[4775]: E1216 14:54:58.129702 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:02.129678568 +0000 UTC m=+27.080757591 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.136339 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.153323 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.163407 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.172542 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.187142 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.200534 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.211978 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.212033 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.212052 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.212077 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.212095 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:58Z","lastTransitionTime":"2025-12-16T14:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.213959 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.226761 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.229818 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/584613dc-ef95-4911-9a79-76e805e1d4d1-rootfs\") pod \"machine-config-daemon-lh6xh\" (UID: \"584613dc-ef95-4911-9a79-76e805e1d4d1\") " pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.229858 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x482d\" (UniqueName: \"kubernetes.io/projected/584613dc-ef95-4911-9a79-76e805e1d4d1-kube-api-access-x482d\") pod \"machine-config-daemon-lh6xh\" (UID: \"584613dc-ef95-4911-9a79-76e805e1d4d1\") " pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.229921 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/584613dc-ef95-4911-9a79-76e805e1d4d1-proxy-tls\") pod \"machine-config-daemon-lh6xh\" (UID: \"584613dc-ef95-4911-9a79-76e805e1d4d1\") " pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.229956 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/584613dc-ef95-4911-9a79-76e805e1d4d1-mcd-auth-proxy-config\") pod \"machine-config-daemon-lh6xh\" (UID: \"584613dc-ef95-4911-9a79-76e805e1d4d1\") " pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.230015 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/584613dc-ef95-4911-9a79-76e805e1d4d1-rootfs\") pod 
\"machine-config-daemon-lh6xh\" (UID: \"584613dc-ef95-4911-9a79-76e805e1d4d1\") " pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.230934 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/584613dc-ef95-4911-9a79-76e805e1d4d1-mcd-auth-proxy-config\") pod \"machine-config-daemon-lh6xh\" (UID: \"584613dc-ef95-4911-9a79-76e805e1d4d1\") " pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.234290 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/584613dc-ef95-4911-9a79-76e805e1d4d1-proxy-tls\") pod \"machine-config-daemon-lh6xh\" (UID: \"584613dc-ef95-4911-9a79-76e805e1d4d1\") " pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.240455 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.244388 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x482d\" (UniqueName: \"kubernetes.io/projected/584613dc-ef95-4911-9a79-76e805e1d4d1-kube-api-access-x482d\") pod \"machine-config-daemon-lh6xh\" (UID: \"584613dc-ef95-4911-9a79-76e805e1d4d1\") " pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.256842 4775 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.278211 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.299701 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.314126 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.314173 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.314185 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.314202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.314215 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:58Z","lastTransitionTime":"2025-12-16T14:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.323671 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.336965 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.337076 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.337000 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:54:58 crc kubenswrapper[4775]: E1216 14:54:58.337248 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:54:58 crc kubenswrapper[4775]: E1216 14:54:58.337694 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.351812 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.361321 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.369203 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.376387 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: W1216 14:54:58.379948 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod584613dc_ef95_4911_9a79_76e805e1d4d1.slice/crio-834b7791f6b32c55bf527d7020c818c1217d2ebad92b8452509ddc2024ac7b9c WatchSource:0}: Error finding container 834b7791f6b32c55bf527d7020c818c1217d2ebad92b8452509ddc2024ac7b9c: Status 404 returned error can't find the container with id 834b7791f6b32c55bf527d7020c818c1217d2ebad92b8452509ddc2024ac7b9c Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.416236 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.416278 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.416291 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.416309 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.416321 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:58Z","lastTransitionTime":"2025-12-16T14:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.430354 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-16 14:49:57 +0000 UTC, rotation deadline is 2026-11-03 21:20:41.505026917 +0000 UTC Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.430433 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7734h25m43.074598064s for next certificate rotation Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.459915 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-hftd7"] Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.460723 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.461012 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-mc2lg"] Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.461354 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.462432 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.462700 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.462806 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.463013 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.463114 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.463228 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.466858 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.480452 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"834b7791f6b32c55bf527d7020c818c1217d2ebad92b8452509ddc2024ac7b9c"} Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.481039 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.485366 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f2p7z" event={"ID":"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5","Type":"ContainerStarted","Data":"b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184"} Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.485420 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f2p7z" 
event={"ID":"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5","Type":"ContainerStarted","Data":"d1b5f315e28d3741494c78fcf2cef5c64aac8b4816f2256e6b6974acfcec8490"} Dec 16 14:54:58 crc kubenswrapper[4775]: E1216 14:54:58.514777 4775 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.517932 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.527670 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.527720 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.527734 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.527761 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.527776 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:58Z","lastTransitionTime":"2025-12-16T14:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532191 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-cnibin\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532231 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-system-cni-dir\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532250 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-multus-cni-dir\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532265 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-multus-conf-dir\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532292 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-multus-socket-dir-parent\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: 
I1216 14:54:58.532310 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/11f516c5-1af7-40c9-b8e2-2ce5386dce33-cnibin\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532327 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-var-lib-cni-multus\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532347 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-run-netns\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532364 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-multus-daemon-config\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532382 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j57ld\" (UniqueName: \"kubernetes.io/projected/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-kube-api-access-j57ld\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 
14:54:58.532415 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-cni-binary-copy\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532430 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-run-k8s-cni-cncf-io\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532453 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-os-release\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532466 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-var-lib-cni-bin\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532483 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-etc-kubernetes\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532497 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11f516c5-1af7-40c9-b8e2-2ce5386dce33-system-cni-dir\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532515 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/11f516c5-1af7-40c9-b8e2-2ce5386dce33-cni-binary-copy\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532533 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-hostroot\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532555 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-var-lib-kubelet\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532574 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11f516c5-1af7-40c9-b8e2-2ce5386dce33-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc 
kubenswrapper[4775]: I1216 14:54:58.532589 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/11f516c5-1af7-40c9-b8e2-2ce5386dce33-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532603 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxz7f\" (UniqueName: \"kubernetes.io/projected/11f516c5-1af7-40c9-b8e2-2ce5386dce33-kube-api-access-dxz7f\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532622 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-run-multus-certs\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.532644 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11f516c5-1af7-40c9-b8e2-2ce5386dce33-os-release\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.551590 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 
14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.573335 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.589386 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.607864 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.621500 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.629678 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.629720 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.629729 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.629743 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.629755 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:58Z","lastTransitionTime":"2025-12-16T14:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.632996 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-var-lib-kubelet\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633036 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11f516c5-1af7-40c9-b8e2-2ce5386dce33-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633061 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/11f516c5-1af7-40c9-b8e2-2ce5386dce33-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633086 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxz7f\" (UniqueName: \"kubernetes.io/projected/11f516c5-1af7-40c9-b8e2-2ce5386dce33-kube-api-access-dxz7f\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " 
pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633109 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-run-multus-certs\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633114 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-var-lib-kubelet\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633161 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11f516c5-1af7-40c9-b8e2-2ce5386dce33-os-release\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633198 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-run-multus-certs\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633247 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-cnibin\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633255 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11f516c5-1af7-40c9-b8e2-2ce5386dce33-os-release\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633200 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-cnibin\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633298 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-system-cni-dir\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633321 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-multus-cni-dir\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633341 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-multus-conf-dir\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633364 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-multus-socket-dir-parent\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633382 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/11f516c5-1af7-40c9-b8e2-2ce5386dce33-cnibin\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633401 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-var-lib-cni-multus\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633421 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-run-netns\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633442 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-multus-daemon-config\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633463 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j57ld\" (UniqueName: \"kubernetes.io/projected/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-kube-api-access-j57ld\") pod 
\"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633470 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-system-cni-dir\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633487 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-multus-socket-dir-parent\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633504 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-cni-binary-copy\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633531 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-run-k8s-cni-cncf-io\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633546 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-multus-cni-dir\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 
14:54:58.633554 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-multus-conf-dir\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633553 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-os-release\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633549 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-var-lib-cni-multus\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633589 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-var-lib-cni-bin\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633617 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-etc-kubernetes\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633628 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-run-k8s-cni-cncf-io\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633637 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11f516c5-1af7-40c9-b8e2-2ce5386dce33-system-cni-dir\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633509 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-run-netns\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633533 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/11f516c5-1af7-40c9-b8e2-2ce5386dce33-cnibin\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633659 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/11f516c5-1af7-40c9-b8e2-2ce5386dce33-cni-binary-copy\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633680 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-hostroot\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633718 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-hostroot\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633748 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-host-var-lib-cni-bin\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633441 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11f516c5-1af7-40c9-b8e2-2ce5386dce33-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633785 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-etc-kubernetes\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633594 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-os-release\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " 
pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633826 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11f516c5-1af7-40c9-b8e2-2ce5386dce33-system-cni-dir\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.633868 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/11f516c5-1af7-40c9-b8e2-2ce5386dce33-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.634371 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-cni-binary-copy\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.634427 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-multus-daemon-config\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.634434 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/11f516c5-1af7-40c9-b8e2-2ce5386dce33-cni-binary-copy\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 
crc kubenswrapper[4775]: I1216 14:54:58.636796 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.648983 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j57ld\" (UniqueName: \"kubernetes.io/projected/f108f76f-c79a-42b0-b5ac-714d49d9a4d5-kube-api-access-j57ld\") pod \"multus-mc2lg\" (UID: \"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\") " pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.652460 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxz7f\" (UniqueName: \"kubernetes.io/projected/11f516c5-1af7-40c9-b8e2-2ce5386dce33-kube-api-access-dxz7f\") pod \"multus-additional-cni-plugins-hftd7\" (UID: \"11f516c5-1af7-40c9-b8e2-2ce5386dce33\") " pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.658337 4775 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.670387 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.690492 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed6525
1d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.704595 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.716837 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.729436 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.732409 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.732448 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.732459 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.732478 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.732489 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:58Z","lastTransitionTime":"2025-12-16T14:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.752984 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.775386 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hftd7" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.779707 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.787357 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mc2lg" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.800446 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: W1216 14:54:58.805662 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf108f76f_c79a_42b0_b5ac_714d49d9a4d5.slice/crio-fc99b37e769ab7c946a70e13f1b56739a887f7badb31e3a75af23260e8b00446 WatchSource:0}: Error finding container fc99b37e769ab7c946a70e13f1b56739a887f7badb31e3a75af23260e8b00446: Status 404 returned error can't find the container with id fc99b37e769ab7c946a70e13f1b56739a887f7badb31e3a75af23260e8b00446 Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.819410 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.835356 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.835398 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.835407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.835422 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.835433 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:58Z","lastTransitionTime":"2025-12-16T14:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.840372 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.856273 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.862215 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-79w7z"] Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.863287 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.867038 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.867524 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.867961 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.868340 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.868385 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.868395 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.868924 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.873249 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.889564 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.904906 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.923271 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.937450 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-run-netns\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.937788 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-kubelet\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.937817 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.937962 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-env-overrides\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.938089 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-slash\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.938112 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-log-socket\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.938151 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-ovnkube-script-lib\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc 
kubenswrapper[4775]: I1216 14:54:58.938185 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-ovn\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.938224 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-node-log\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.938242 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-ovnkube-config\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.938273 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-cni-bin\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.938311 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-systemd\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: 
I1216 14:54:58.938334 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-openvswitch\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.938358 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-var-lib-openvswitch\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.938383 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-etc-openvswitch\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.938410 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-systemd-units\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.938431 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-run-ovn-kubernetes\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.938451 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-cni-netd\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.938471 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcjwq\" (UniqueName: \"kubernetes.io/projected/524488dd-74ee-43ea-ac0f-5e04d59af434-kube-api-access-gcjwq\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.938497 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/524488dd-74ee-43ea-ac0f-5e04d59af434-ovn-node-metrics-cert\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.942163 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.942206 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.942219 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.942236 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:58 crc 
kubenswrapper[4775]: I1216 14:54:58.942252 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:58Z","lastTransitionTime":"2025-12-16T14:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.942251 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.957069 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.978413 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:58 crc kubenswrapper[4775]: I1216 14:54:58.991073 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.003325 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.017505 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.035828 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.038995 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-kubelet\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039053 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039086 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-env-overrides\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039122 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-slash\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039145 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-log-socket\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039165 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039229 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-slash\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039243 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-ovnkube-script-lib\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039132 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-kubelet\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039297 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-log-socket\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039335 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-ovn\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039378 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-ovn\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039462 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-node-log\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039487 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-ovnkube-config\") pod \"ovnkube-node-79w7z\" (UID: 
\"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039517 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-cni-bin\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039559 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-systemd\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039565 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-node-log\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039581 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-openvswitch\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039613 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-openvswitch\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc 
kubenswrapper[4775]: I1216 14:54:59.039620 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-cni-bin\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039620 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-systemd\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039675 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-var-lib-openvswitch\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039708 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-etc-openvswitch\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039737 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-var-lib-openvswitch\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039747 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-systemd-units\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039779 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-etc-openvswitch\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039779 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-run-ovn-kubernetes\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039814 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-run-ovn-kubernetes\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039835 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-cni-netd\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039818 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-systemd-units\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039818 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-cni-netd\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039875 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcjwq\" (UniqueName: \"kubernetes.io/projected/524488dd-74ee-43ea-ac0f-5e04d59af434-kube-api-access-gcjwq\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039919 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/524488dd-74ee-43ea-ac0f-5e04d59af434-ovn-node-metrics-cert\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039940 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-run-netns\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.039947 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-env-overrides\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.040039 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-run-netns\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.040234 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-ovnkube-script-lib\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.040392 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-ovnkube-config\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.044791 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/524488dd-74ee-43ea-ac0f-5e04d59af434-ovn-node-metrics-cert\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.045318 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.045364 4775 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.045380 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.045401 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.045417 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:59Z","lastTransitionTime":"2025-12-16T14:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.050689 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.059158 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcjwq\" (UniqueName: \"kubernetes.io/projected/524488dd-74ee-43ea-ac0f-5e04d59af434-kube-api-access-gcjwq\") pod \"ovnkube-node-79w7z\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.062902 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11
\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.077118 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.093336 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.115603 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.147197 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.147236 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.147249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.147264 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.147275 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:59Z","lastTransitionTime":"2025-12-16T14:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.249435 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.249471 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.249484 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.249501 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.249514 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:59Z","lastTransitionTime":"2025-12-16T14:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.257203 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.265067 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.267597 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.280366 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\
":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.293096 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.303598 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.313302 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.324754 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.335356 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.337664 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:54:59 crc kubenswrapper[4775]: E1216 14:54:59.337802 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.351191 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.351653 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.351829 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.351849 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.351923 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.351943 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:59Z","lastTransitionTime":"2025-12-16T14:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.363262 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.383571 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.397189 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.418046 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.419121 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.434724 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: W1216 14:54:59.438063 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod524488dd_74ee_43ea_ac0f_5e04d59af434.slice/crio-2ef64c25ce5ae2d4b03af0088361e90d21c0e774f0b7e35e863b36c08e80df16 WatchSource:0}: Error finding container 2ef64c25ce5ae2d4b03af0088361e90d21c0e774f0b7e35e863b36c08e80df16: Status 404 returned error can't find the container with id 2ef64c25ce5ae2d4b03af0088361e90d21c0e774f0b7e35e863b36c08e80df16 Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.446993 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.454260 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.454320 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.454334 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.454357 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.454403 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:59Z","lastTransitionTime":"2025-12-16T14:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.476214 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.492727 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mc2lg" event={"ID":"f108f76f-c79a-42b0-b5ac-714d49d9a4d5","Type":"ContainerStarted","Data":"e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7"} Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.492794 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mc2lg" event={"ID":"f108f76f-c79a-42b0-b5ac-714d49d9a4d5","Type":"ContainerStarted","Data":"fc99b37e769ab7c946a70e13f1b56739a887f7badb31e3a75af23260e8b00446"} Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.494366 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" event={"ID":"11f516c5-1af7-40c9-b8e2-2ce5386dce33","Type":"ContainerStarted","Data":"2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500"} Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.494402 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" event={"ID":"11f516c5-1af7-40c9-b8e2-2ce5386dce33","Type":"ContainerStarted","Data":"d6b711f491f711d5e3fab1ae56b2f09b0b261d8173335ebab7f3f258505ecb03"} Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 
14:54:59.495740 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67"} Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.495767 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26"} Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.498263 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerStarted","Data":"2ef64c25ce5ae2d4b03af0088361e90d21c0e774f0b7e35e863b36c08e80df16"} Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.515498 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.557275 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.557329 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.557339 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.557359 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.557376 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:59Z","lastTransitionTime":"2025-12-16T14:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.563513 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.610571 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.658816 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.660670 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.660702 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.660714 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.660731 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.660742 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:59Z","lastTransitionTime":"2025-12-16T14:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.683258 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.722252 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.762005 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.764223 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.764279 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.764290 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.764314 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.764325 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:59Z","lastTransitionTime":"2025-12-16T14:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.799480 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.836082 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.867876 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.867955 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.867971 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.867995 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.868009 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:59Z","lastTransitionTime":"2025-12-16T14:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.879008 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.920144 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.957459 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.971458 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.971506 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.971518 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.971536 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.971549 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:54:59Z","lastTransitionTime":"2025-12-16T14:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:54:59 crc kubenswrapper[4775]: I1216 14:54:59.996435 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:54:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.037809 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.073597 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.073900 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.073966 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.074040 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.074110 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:00Z","lastTransitionTime":"2025-12-16T14:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.076852 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.114805 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.159473 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.176705 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.176748 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.176757 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.176773 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.176784 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:00Z","lastTransitionTime":"2025-12-16T14:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.196684 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath
\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.240762 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.276736 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.279088 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.279152 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.279164 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.279182 4775 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.279194 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:00Z","lastTransitionTime":"2025-12-16T14:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.317453 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.337776 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.337854 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:00 crc kubenswrapper[4775]: E1216 14:55:00.337949 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:00 crc kubenswrapper[4775]: E1216 14:55:00.338283 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.356134 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.381307 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.381347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.381359 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.381374 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.381384 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:00Z","lastTransitionTime":"2025-12-16T14:55:00Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.404277 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.405920 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-47t7r"] Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.406543 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-47t7r" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.427919 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.448077 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.454982 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0ef8da9e-565b-40c0-a37d-f4f44c552912-serviceca\") pod \"node-ca-47t7r\" (UID: \"0ef8da9e-565b-40c0-a37d-f4f44c552912\") " pod="openshift-image-registry/node-ca-47t7r" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.455039 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0ef8da9e-565b-40c0-a37d-f4f44c552912-host\") pod \"node-ca-47t7r\" (UID: \"0ef8da9e-565b-40c0-a37d-f4f44c552912\") " pod="openshift-image-registry/node-ca-47t7r" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.455071 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flz7d\" (UniqueName: \"kubernetes.io/projected/0ef8da9e-565b-40c0-a37d-f4f44c552912-kube-api-access-flz7d\") pod \"node-ca-47t7r\" (UID: \"0ef8da9e-565b-40c0-a37d-f4f44c552912\") " pod="openshift-image-registry/node-ca-47t7r" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.467765 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.484206 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:00 crc kubenswrapper[4775]: 
I1216 14:55:00.484246 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.484278 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.484296 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.484306 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:00Z","lastTransitionTime":"2025-12-16T14:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.489179 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.502496 4775 generic.go:334] "Generic (PLEG): container finished" podID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerID="fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79" exitCode=0 Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.502558 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerDied","Data":"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79"} Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.504451 4775 generic.go:334] "Generic (PLEG): container finished" podID="11f516c5-1af7-40c9-b8e2-2ce5386dce33" containerID="2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500" exitCode=0 Dec 16 
14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.504573 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" event={"ID":"11f516c5-1af7-40c9-b8e2-2ce5386dce33","Type":"ContainerDied","Data":"2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500"} Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.535440 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerI
D\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.556202 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0ef8da9e-565b-40c0-a37d-f4f44c552912-serviceca\") pod \"node-ca-47t7r\" (UID: \"0ef8da9e-565b-40c0-a37d-f4f44c552912\") " pod="openshift-image-registry/node-ca-47t7r" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.556256 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flz7d\" (UniqueName: \"kubernetes.io/projected/0ef8da9e-565b-40c0-a37d-f4f44c552912-kube-api-access-flz7d\") pod \"node-ca-47t7r\" (UID: \"0ef8da9e-565b-40c0-a37d-f4f44c552912\") " pod="openshift-image-registry/node-ca-47t7r" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.556281 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0ef8da9e-565b-40c0-a37d-f4f44c552912-host\") pod \"node-ca-47t7r\" (UID: \"0ef8da9e-565b-40c0-a37d-f4f44c552912\") " pod="openshift-image-registry/node-ca-47t7r" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.557240 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0ef8da9e-565b-40c0-a37d-f4f44c552912-host\") pod \"node-ca-47t7r\" (UID: \"0ef8da9e-565b-40c0-a37d-f4f44c552912\") " pod="openshift-image-registry/node-ca-47t7r" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.557629 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.558409 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0ef8da9e-565b-40c0-a37d-f4f44c552912-serviceca\") pod \"node-ca-47t7r\" (UID: \"0ef8da9e-565b-40c0-a37d-f4f44c552912\") " pod="openshift-image-registry/node-ca-47t7r" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.589467 4775 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.589528 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.589541 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.590294 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.590325 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:00Z","lastTransitionTime":"2025-12-16T14:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.597625 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flz7d\" (UniqueName: \"kubernetes.io/projected/0ef8da9e-565b-40c0-a37d-f4f44c552912-kube-api-access-flz7d\") pod \"node-ca-47t7r\" (UID: \"0ef8da9e-565b-40c0-a37d-f4f44c552912\") " pod="openshift-image-registry/node-ca-47t7r" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.620494 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-sc
ript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.661127 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.692579 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.692625 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.692639 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.692656 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.692669 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:00Z","lastTransitionTime":"2025-12-16T14:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.703795 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.719957 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-47t7r" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.740527 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: W1216 14:55:00.743155 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ef8da9e_565b_40c0_a37d_f4f44c552912.slice/crio-e0aa7c572b9430fd71e8ff8534f341563971a5c28d7b678ffacb83861d3fa088 WatchSource:0}: Error finding container e0aa7c572b9430fd71e8ff8534f341563971a5c28d7b678ffacb83861d3fa088: Status 404 returned error can't find the container with id e0aa7c572b9430fd71e8ff8534f341563971a5c28d7b678ffacb83861d3fa088 Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.776281 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.796804 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.796846 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:00 crc 
kubenswrapper[4775]: I1216 14:55:00.796857 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.796909 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.796922 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:00Z","lastTransitionTime":"2025-12-16T14:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.815423 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.857605 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.901239 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.901303 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.901316 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.901366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.901440 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:00Z","lastTransitionTime":"2025-12-16T14:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.947303 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:
54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.964078 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:00 crc kubenswrapper[4775]: I1216 14:55:00.980928 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:00Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.009057 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.009125 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.009138 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.009154 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.009182 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:01Z","lastTransitionTime":"2025-12-16T14:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.019428 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.057901 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.095433 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.111476 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.111521 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.111533 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 
14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.111550 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.111561 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:01Z","lastTransitionTime":"2025-12-16T14:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.144831 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/
var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.177491 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.214643 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.214698 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.214713 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.214732 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.214746 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:01Z","lastTransitionTime":"2025-12-16T14:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.217169 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.254778 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.317267 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.317318 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.317331 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.317351 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.317363 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:01Z","lastTransitionTime":"2025-12-16T14:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.337544 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:01 crc kubenswrapper[4775]: E1216 14:55:01.337713 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.420200 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.420250 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.420264 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.420284 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.420298 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:01Z","lastTransitionTime":"2025-12-16T14:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.508150 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-47t7r" event={"ID":"0ef8da9e-565b-40c0-a37d-f4f44c552912","Type":"ContainerStarted","Data":"e0aa7c572b9430fd71e8ff8534f341563971a5c28d7b678ffacb83861d3fa088"} Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.509773 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" event={"ID":"11f516c5-1af7-40c9-b8e2-2ce5386dce33","Type":"ContainerStarted","Data":"989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b"} Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.512111 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerStarted","Data":"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c"} Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.512144 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerStarted","Data":"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814"} Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.522388 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.522431 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.522441 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.522456 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.522467 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:01Z","lastTransitionTime":"2025-12-16T14:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.523846 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.541963 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.554110 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.567503 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.579106 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.597330 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.621035 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.625148 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.625200 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.625215 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.625236 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.625249 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:01Z","lastTransitionTime":"2025-12-16T14:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.642539 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.655518 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.677941 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.703367 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.727078 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:01 crc 
kubenswrapper[4775]: I1216 14:55:01.727122 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.727132 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.727148 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.727157 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:01Z","lastTransitionTime":"2025-12-16T14:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.736304 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.777655 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.818106 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.829403 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.829450 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.829462 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.829480 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.829492 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:01Z","lastTransitionTime":"2025-12-16T14:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.857845 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.932791 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.932860 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.932881 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 
14:55:01.932950 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:01 crc kubenswrapper[4775]: I1216 14:55:01.932971 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:01Z","lastTransitionTime":"2025-12-16T14:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.035763 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.035837 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.035854 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.035880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.035923 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:02Z","lastTransitionTime":"2025-12-16T14:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.073067 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.073254 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:02 crc kubenswrapper[4775]: E1216 14:55:02.073292 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:55:10.073257005 +0000 UTC m=+35.024335938 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.073346 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:02 crc kubenswrapper[4775]: E1216 14:55:02.073405 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:55:02 crc kubenswrapper[4775]: E1216 14:55:02.073492 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:55:02 crc kubenswrapper[4775]: E1216 14:55:02.073518 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:10.073491583 +0000 UTC m=+35.024570546 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:55:02 crc kubenswrapper[4775]: E1216 14:55:02.073552 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:10.073538954 +0000 UTC m=+35.024617887 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.139492 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.139526 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.139538 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.139552 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.139561 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:02Z","lastTransitionTime":"2025-12-16T14:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.174659 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.174723 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:02 crc kubenswrapper[4775]: E1216 14:55:02.174862 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:55:02 crc kubenswrapper[4775]: E1216 14:55:02.174915 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:55:02 crc kubenswrapper[4775]: E1216 14:55:02.174927 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:55:02 crc kubenswrapper[4775]: E1216 14:55:02.174937 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:55:02 crc kubenswrapper[4775]: E1216 14:55:02.175026 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:55:02 crc kubenswrapper[4775]: E1216 14:55:02.175079 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:55:02 crc kubenswrapper[4775]: E1216 14:55:02.174986 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:10.174968011 +0000 UTC m=+35.126046934 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:55:02 crc kubenswrapper[4775]: E1216 14:55:02.175228 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:10.175194378 +0000 UTC m=+35.126273331 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.241917 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.241976 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.241995 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.242019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.242038 4775 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:02Z","lastTransitionTime":"2025-12-16T14:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.337153 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.337228 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:02 crc kubenswrapper[4775]: E1216 14:55:02.337348 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:02 crc kubenswrapper[4775]: E1216 14:55:02.337481 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.345835 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.345881 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.345918 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.345937 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.345950 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:02Z","lastTransitionTime":"2025-12-16T14:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.448299 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.448358 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.448373 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.448397 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.448412 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:02Z","lastTransitionTime":"2025-12-16T14:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.518432 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-47t7r" event={"ID":"0ef8da9e-565b-40c0-a37d-f4f44c552912","Type":"ContainerStarted","Data":"0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8"} Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.525682 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerStarted","Data":"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555"} Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.528225 4775 generic.go:334] "Generic (PLEG): container finished" podID="11f516c5-1af7-40c9-b8e2-2ce5386dce33" containerID="989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b" exitCode=0 Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.528285 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" event={"ID":"11f516c5-1af7-40c9-b8e2-2ce5386dce33","Type":"ContainerDied","Data":"989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b"} Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.536982 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.552548 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:02 crc 
kubenswrapper[4775]: I1216 14:55:02.552761 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.552873 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.553095 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.553209 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:02Z","lastTransitionTime":"2025-12-16T14:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.553235 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.568379 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.580126 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.605080 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14
:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.617467 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.631431 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.644417 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.655615 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:02 crc 
kubenswrapper[4775]: I1216 14:55:02.655650 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.655658 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.655670 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.655680 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:02Z","lastTransitionTime":"2025-12-16T14:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.664448 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.679065 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.691366 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.705124 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.716396 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.730708 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.743503 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.757684 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.758627 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.758656 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.758664 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:02 crc 
kubenswrapper[4775]: I1216 14:55:02.758678 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.758688 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:02Z","lastTransitionTime":"2025-12-16T14:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.772539 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 
14:55:02.787282 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.806406 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.821134 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.833363 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.843054 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.859341 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resource
s\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z
\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.860836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.860874 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:02 crc 
kubenswrapper[4775]: I1216 14:55:02.860901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.860917 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.860926 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:02Z","lastTransitionTime":"2025-12-16T14:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.873152 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.888354 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.904617 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.951211 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.963092 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.963127 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.963142 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.963161 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.963175 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:02Z","lastTransitionTime":"2025-12-16T14:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:02 crc kubenswrapper[4775]: I1216 14:55:02.981558 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.015058 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.059478 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.066045 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.066084 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.066095 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.066112 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.066124 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:03Z","lastTransitionTime":"2025-12-16T14:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.168754 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.168815 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.168826 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.168845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.168856 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:03Z","lastTransitionTime":"2025-12-16T14:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.271128 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.271195 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.271214 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.271238 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.271267 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:03Z","lastTransitionTime":"2025-12-16T14:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.337816 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:03 crc kubenswrapper[4775]: E1216 14:55:03.338003 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.373334 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.373375 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.373386 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.373402 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.373413 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:03Z","lastTransitionTime":"2025-12-16T14:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.476303 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.476344 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.476355 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.476372 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.476384 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:03Z","lastTransitionTime":"2025-12-16T14:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.536497 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerStarted","Data":"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683"} Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.539400 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" event={"ID":"11f516c5-1af7-40c9-b8e2-2ce5386dce33","Type":"ContainerStarted","Data":"60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724"} Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.579455 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.579539 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.579557 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.579585 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.579605 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:03Z","lastTransitionTime":"2025-12-16T14:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.682397 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.682440 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.682451 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.682469 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.682480 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:03Z","lastTransitionTime":"2025-12-16T14:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.786134 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.786220 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.786248 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.786297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.786324 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:03Z","lastTransitionTime":"2025-12-16T14:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.889399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.889647 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.889656 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.889670 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.889680 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:03Z","lastTransitionTime":"2025-12-16T14:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.992805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.992848 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.992860 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.992877 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:03 crc kubenswrapper[4775]: I1216 14:55:03.992913 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:03Z","lastTransitionTime":"2025-12-16T14:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.095815 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.095958 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.095984 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.096016 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.096040 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:04Z","lastTransitionTime":"2025-12-16T14:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.198545 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.198578 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.198587 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.198601 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.198612 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:04Z","lastTransitionTime":"2025-12-16T14:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.301484 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.301536 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.301547 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.301752 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.301763 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:04Z","lastTransitionTime":"2025-12-16T14:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.337241 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.337299 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:04 crc kubenswrapper[4775]: E1216 14:55:04.337489 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:04 crc kubenswrapper[4775]: E1216 14:55:04.337589 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.403941 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.403984 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.403997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.404020 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.404035 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:04Z","lastTransitionTime":"2025-12-16T14:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.506289 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.506359 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.506378 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.506406 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.506424 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:04Z","lastTransitionTime":"2025-12-16T14:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.546083 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerStarted","Data":"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c"} Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.546134 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerStarted","Data":"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583"} Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.548208 4775 generic.go:334] "Generic (PLEG): container finished" podID="11f516c5-1af7-40c9-b8e2-2ce5386dce33" containerID="60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724" exitCode=0 Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.548245 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" event={"ID":"11f516c5-1af7-40c9-b8e2-2ce5386dce33","Type":"ContainerDied","Data":"60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724"} Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.564652 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.597117 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.609377 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.609433 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.609448 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.609469 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.609484 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:04Z","lastTransitionTime":"2025-12-16T14:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.610647 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.625115 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.643332 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.655358 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.667488 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.677694 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.689973 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.702597 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.712270 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.712302 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.712311 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.712327 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.712365 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:04Z","lastTransitionTime":"2025-12-16T14:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.715536 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.725998 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.741534 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 
14:55:04.753420 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 
14:55:04.769629 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.815134 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.815193 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.815210 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.815231 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.815246 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:04Z","lastTransitionTime":"2025-12-16T14:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.917546 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.917601 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.917614 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.917634 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:04 crc kubenswrapper[4775]: I1216 14:55:04.917647 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:04Z","lastTransitionTime":"2025-12-16T14:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.020504 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.020541 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.020553 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.020572 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.020583 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:05Z","lastTransitionTime":"2025-12-16T14:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.124668 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.124722 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.124740 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.124763 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.124782 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:05Z","lastTransitionTime":"2025-12-16T14:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.205206 4775 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.227699 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.227728 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.227737 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.227751 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.227761 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:05Z","lastTransitionTime":"2025-12-16T14:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.330421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.330464 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.330481 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.330498 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.330511 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:05Z","lastTransitionTime":"2025-12-16T14:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.337057 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:05 crc kubenswrapper[4775]: E1216 14:55:05.337194 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.351593 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs
.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.376023 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.388313 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.399411 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.412705 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.429651 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.433238 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.433272 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.433281 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.433297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.433309 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:05Z","lastTransitionTime":"2025-12-16T14:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.448224 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.461671 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.472386 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.487003 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 
14:55:05.501035 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 
14:55:05.511719 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.522502 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.533306 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.535530 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.535554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.535562 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.535576 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.535586 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:05Z","lastTransitionTime":"2025-12-16T14:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.551629 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.553291 4775 generic.go:334] "Generic (PLEG): container finished" podID="11f516c5-1af7-40c9-b8e2-2ce5386dce33" containerID="a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0" exitCode=0 Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.553329 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" 
event={"ID":"11f516c5-1af7-40c9-b8e2-2ce5386dce33","Type":"ContainerDied","Data":"a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0"} Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.584685 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269
019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a56
46fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCod
e\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.598001 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.639694 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.639734 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.639745 4775 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.639765 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.639778 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:05Z","lastTransitionTime":"2025-12-16T14:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.642713 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.654683 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f8
3a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.666558 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.680216 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.694855 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.708685 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.722085 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.738514 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.741997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.742027 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.742036 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.742050 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.742068 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:05Z","lastTransitionTime":"2025-12-16T14:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.756574 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.771146 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.786729 4775 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.801865 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.816866 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.845336 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.845388 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.845397 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.845415 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.845429 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:05Z","lastTransitionTime":"2025-12-16T14:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.947897 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.947937 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.947951 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.947971 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:05 crc kubenswrapper[4775]: I1216 14:55:05.947986 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:05Z","lastTransitionTime":"2025-12-16T14:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.050909 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.051433 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.051450 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.051470 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.051485 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:06Z","lastTransitionTime":"2025-12-16T14:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.154162 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.154234 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.154254 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.154286 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.154305 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:06Z","lastTransitionTime":"2025-12-16T14:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.257753 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.257841 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.257861 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.257940 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.257967 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:06Z","lastTransitionTime":"2025-12-16T14:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.337238 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.337273 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:06 crc kubenswrapper[4775]: E1216 14:55:06.337446 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:06 crc kubenswrapper[4775]: E1216 14:55:06.337625 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.361981 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.362068 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.362092 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.362125 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.362152 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:06Z","lastTransitionTime":"2025-12-16T14:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.464976 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.465025 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.465163 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.465182 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.465193 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:06Z","lastTransitionTime":"2025-12-16T14:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.562391 4775 generic.go:334] "Generic (PLEG): container finished" podID="11f516c5-1af7-40c9-b8e2-2ce5386dce33" containerID="12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159" exitCode=0 Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.562457 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" event={"ID":"11f516c5-1af7-40c9-b8e2-2ce5386dce33","Type":"ContainerDied","Data":"12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159"} Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.568796 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.568864 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.568924 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.568955 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.568976 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:06Z","lastTransitionTime":"2025-12-16T14:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.589772 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:06Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.609673 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce6
4a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:06Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.638971 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:06Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.657329 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:06Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.671270 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.671485 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.671617 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.671744 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.671878 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:06Z","lastTransitionTime":"2025-12-16T14:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.682037 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:06Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.696804 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:06Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.715919 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:06Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.732414 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:06Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.750615 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:06Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.763480 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:06Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.774657 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.774703 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.774713 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.774728 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.774738 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:06Z","lastTransitionTime":"2025-12-16T14:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.780451 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:06Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.794198 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-16T14:55:06Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.805034 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:06Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.831159 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0f
a3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource
-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\
\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:06Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.845820 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:06Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.876735 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.876765 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.876774 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.876788 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:06 crc kubenswrapper[4775]: I1216 14:55:06.876799 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:06Z","lastTransitionTime":"2025-12-16T14:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.004064 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.004119 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.004135 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.004160 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.004177 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:07Z","lastTransitionTime":"2025-12-16T14:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.006378 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.006416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.006432 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.006452 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.006466 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:07Z","lastTransitionTime":"2025-12-16T14:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:07 crc kubenswrapper[4775]: E1216 14:55:07.033510 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:07Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.041626 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.041670 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.041687 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.041710 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.041728 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:07Z","lastTransitionTime":"2025-12-16T14:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:07 crc kubenswrapper[4775]: E1216 14:55:07.069205 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:07Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.072905 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.072963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.072974 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.072997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.073016 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:07Z","lastTransitionTime":"2025-12-16T14:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:07 crc kubenswrapper[4775]: E1216 14:55:07.090817 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:07Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.094378 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.094423 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.094437 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.094456 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.094469 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:07Z","lastTransitionTime":"2025-12-16T14:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:07 crc kubenswrapper[4775]: E1216 14:55:07.108261 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:07Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.113306 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.113364 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.113373 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.113393 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.113406 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:07Z","lastTransitionTime":"2025-12-16T14:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:07 crc kubenswrapper[4775]: E1216 14:55:07.128126 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:07Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:07 crc kubenswrapper[4775]: E1216 14:55:07.128306 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.130103 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.130142 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.130153 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.130171 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.130183 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:07Z","lastTransitionTime":"2025-12-16T14:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.233738 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.233796 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.233810 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.233832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.233846 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:07Z","lastTransitionTime":"2025-12-16T14:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.336256 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.336325 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.336342 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.336368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.336385 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:07Z","lastTransitionTime":"2025-12-16T14:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.336976 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:07 crc kubenswrapper[4775]: E1216 14:55:07.337135 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.440006 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.440100 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.440120 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.440147 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.440169 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:07Z","lastTransitionTime":"2025-12-16T14:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.543967 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.544052 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.544076 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.544107 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.544125 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:07Z","lastTransitionTime":"2025-12-16T14:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.578809 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerStarted","Data":"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3"} Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.646061 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.646102 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.646114 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.646135 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.646149 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:07Z","lastTransitionTime":"2025-12-16T14:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.748400 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.748449 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.748473 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.748499 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.748514 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:07Z","lastTransitionTime":"2025-12-16T14:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.851321 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.851408 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.851441 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.851474 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.851496 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:07Z","lastTransitionTime":"2025-12-16T14:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.954269 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.954349 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.954361 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.954379 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:07 crc kubenswrapper[4775]: I1216 14:55:07.954392 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:07Z","lastTransitionTime":"2025-12-16T14:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.057259 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.057310 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.057327 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.057348 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.057362 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:08Z","lastTransitionTime":"2025-12-16T14:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.160238 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.160291 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.160303 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.160320 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.160331 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:08Z","lastTransitionTime":"2025-12-16T14:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.263000 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.263103 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.263136 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.263167 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.263186 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:08Z","lastTransitionTime":"2025-12-16T14:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.337392 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.337443 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:08 crc kubenswrapper[4775]: E1216 14:55:08.337636 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:08 crc kubenswrapper[4775]: E1216 14:55:08.337760 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.365868 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.365973 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.365992 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.366019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.366038 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:08Z","lastTransitionTime":"2025-12-16T14:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.469268 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.469314 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.469328 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.469344 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.469355 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:08Z","lastTransitionTime":"2025-12-16T14:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.572654 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.572709 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.572726 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.572750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.572766 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:08Z","lastTransitionTime":"2025-12-16T14:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.586293 4775 generic.go:334] "Generic (PLEG): container finished" podID="11f516c5-1af7-40c9-b8e2-2ce5386dce33" containerID="916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1" exitCode=0 Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.586388 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" event={"ID":"11f516c5-1af7-40c9-b8e2-2ce5386dce33","Type":"ContainerDied","Data":"916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1"} Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.605808 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.629564 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.640670 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.662765 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.675815 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.675860 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.675872 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.675929 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.675944 4775 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:08Z","lastTransitionTime":"2025-12-16T14:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.678094 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.696759 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.709357 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.722606 4775 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.735302 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.745740 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.763165 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.775412 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.779161 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.779222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.779238 4775 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.779260 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.779271 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:08Z","lastTransitionTime":"2025-12-16T14:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.786754 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.797799 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f8
3a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.807526 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.882507 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.882559 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.882568 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.882581 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.882590 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:08Z","lastTransitionTime":"2025-12-16T14:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.985626 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.985679 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.985691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.985709 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:08 crc kubenswrapper[4775]: I1216 14:55:08.985723 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:08Z","lastTransitionTime":"2025-12-16T14:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.087919 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.087981 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.087998 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.088022 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.088041 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:09Z","lastTransitionTime":"2025-12-16T14:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.191125 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.191176 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.191191 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.191210 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.191225 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:09Z","lastTransitionTime":"2025-12-16T14:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.294202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.294232 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.294241 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.294255 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.294265 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:09Z","lastTransitionTime":"2025-12-16T14:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.337272 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:09 crc kubenswrapper[4775]: E1216 14:55:09.337441 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.397204 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.397249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.397266 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.397286 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.397300 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:09Z","lastTransitionTime":"2025-12-16T14:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.500836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.500916 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.500932 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.500955 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.500973 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:09Z","lastTransitionTime":"2025-12-16T14:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.593785 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" event={"ID":"11f516c5-1af7-40c9-b8e2-2ce5386dce33","Type":"ContainerStarted","Data":"0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899"} Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.603720 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.603770 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.603783 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.603804 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.603817 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:09Z","lastTransitionTime":"2025-12-16T14:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.609378 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:09Z 
is after 2025-08-24T17:21:41Z" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.630941 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.647374 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.663660 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.675730 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.694077 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b
3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.706283 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.707196 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.707260 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.707298 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.707312 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:09Z","lastTransitionTime":"2025-12-16T14:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.713844 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.729445 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.745262 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.759396 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.772022 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f8
3a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.782525 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.801426 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14
:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.811419 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.811449 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.811459 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.811476 4775 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.811487 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:09Z","lastTransitionTime":"2025-12-16T14:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.814489 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.825724 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.913990 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.914031 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.914042 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.914060 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:09 crc kubenswrapper[4775]: I1216 14:55:09.914070 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:09Z","lastTransitionTime":"2025-12-16T14:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.016458 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.016504 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.016517 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.016534 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.016547 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:10Z","lastTransitionTime":"2025-12-16T14:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.093027 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:55:10 crc kubenswrapper[4775]: E1216 14:55:10.093281 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 14:55:26.093240391 +0000 UTC m=+51.044319344 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.093504 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.093545 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:10 crc kubenswrapper[4775]: E1216 14:55:10.093655 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:55:10 crc kubenswrapper[4775]: E1216 14:55:10.093728 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-16 14:55:26.093707906 +0000 UTC m=+51.044786889 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:55:10 crc kubenswrapper[4775]: E1216 14:55:10.093729 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:55:10 crc kubenswrapper[4775]: E1216 14:55:10.093815 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:26.093796998 +0000 UTC m=+51.044875951 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.120042 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.120089 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.120100 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.120117 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.120129 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:10Z","lastTransitionTime":"2025-12-16T14:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.194527 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.194634 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:10 crc kubenswrapper[4775]: E1216 14:55:10.194788 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:55:10 crc kubenswrapper[4775]: E1216 14:55:10.194862 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:55:10 crc kubenswrapper[4775]: E1216 14:55:10.194880 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:55:10 crc kubenswrapper[4775]: E1216 14:55:10.194812 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:55:10 crc 
kubenswrapper[4775]: E1216 14:55:10.194971 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:55:10 crc kubenswrapper[4775]: E1216 14:55:10.194991 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:55:10 crc kubenswrapper[4775]: E1216 14:55:10.195016 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:26.194990547 +0000 UTC m=+51.146069500 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:55:10 crc kubenswrapper[4775]: E1216 14:55:10.195060 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:26.195035869 +0000 UTC m=+51.146114832 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.223133 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.223185 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.223198 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.223213 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.223223 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:10Z","lastTransitionTime":"2025-12-16T14:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.325261 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.325307 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.325317 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.325333 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.325344 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:10Z","lastTransitionTime":"2025-12-16T14:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.337158 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.337168 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:10 crc kubenswrapper[4775]: E1216 14:55:10.337390 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:10 crc kubenswrapper[4775]: E1216 14:55:10.337468 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.429026 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.429080 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.429093 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.429110 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.429120 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:10Z","lastTransitionTime":"2025-12-16T14:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.532496 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.532550 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.532565 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.532587 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.532604 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:10Z","lastTransitionTime":"2025-12-16T14:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.603109 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerStarted","Data":"028f8743408bac43d8cd8fb2663da7bcc80969c5fce34edfe15f09ca946fb8da"} Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.618026 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is 
after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.630742 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.634830 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.635012 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.635143 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.635255 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.635356 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:10Z","lastTransitionTime":"2025-12-16T14:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.653842 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.669611 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.683587 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.701716 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.726321 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://028f8743408bac43d8cd8fb2663da7bcc80969c5fce34edfe15f09ca946fb8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.738831 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.738928 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.738955 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.738986 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.739009 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:10Z","lastTransitionTime":"2025-12-16T14:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.743410 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.763495 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.775195 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.790521 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b
3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.807353 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.823667 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.840139 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.841845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.841879 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.841906 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.841924 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.841937 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:10Z","lastTransitionTime":"2025-12-16T14:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.857250 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.895352 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg"] Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.896057 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.898083 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.898755 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.913808 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819
eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb
1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638f
a0d724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4c
d282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.927081 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.944832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:10 crc 
kubenswrapper[4775]: I1216 14:55:10.944919 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.944934 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.944954 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.944966 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:10Z","lastTransitionTime":"2025-12-16T14:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.951608 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://028f8743408bac43d8cd8fb2663da7bcc80969c5fce34edfe15f09ca946fb8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"
},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa
6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.978970 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:10 crc kubenswrapper[4775]: I1216 14:55:10.992542 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.004875 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.005560 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06c229d1-beab-4662-96c5-e458d6cd3e83-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jv9gg\" (UID: \"06c229d1-beab-4662-96c5-e458d6cd3e83\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.005658 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cs7c\" (UniqueName: \"kubernetes.io/projected/06c229d1-beab-4662-96c5-e458d6cd3e83-kube-api-access-9cs7c\") pod \"ovnkube-control-plane-749d76644c-jv9gg\" (UID: \"06c229d1-beab-4662-96c5-e458d6cd3e83\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.005706 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06c229d1-beab-4662-96c5-e458d6cd3e83-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jv9gg\" (UID: \"06c229d1-beab-4662-96c5-e458d6cd3e83\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.005798 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/06c229d1-beab-4662-96c5-e458d6cd3e83-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jv9gg\" (UID: \"06c229d1-beab-4662-96c5-e458d6cd3e83\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.020354 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.034436 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.047466 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.047632 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.047927 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.047936 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.047951 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.047961 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:11Z","lastTransitionTime":"2025-12-16T14:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.064045 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.076281 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"ipta
bles-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.087422 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f70
05c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}
}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.098035 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.107220 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06c229d1-beab-4662-96c5-e458d6cd3e83-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jv9gg\" (UID: \"06c229d1-beab-4662-96c5-e458d6cd3e83\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.107423 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cs7c\" (UniqueName: \"kubernetes.io/projected/06c229d1-beab-4662-96c5-e458d6cd3e83-kube-api-access-9cs7c\") pod \"ovnkube-control-plane-749d76644c-jv9gg\" (UID: \"06c229d1-beab-4662-96c5-e458d6cd3e83\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" Dec 
16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.107588 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06c229d1-beab-4662-96c5-e458d6cd3e83-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jv9gg\" (UID: \"06c229d1-beab-4662-96c5-e458d6cd3e83\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.107715 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/06c229d1-beab-4662-96c5-e458d6cd3e83-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jv9gg\" (UID: \"06c229d1-beab-4662-96c5-e458d6cd3e83\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.108094 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06c229d1-beab-4662-96c5-e458d6cd3e83-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jv9gg\" (UID: \"06c229d1-beab-4662-96c5-e458d6cd3e83\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.108239 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/06c229d1-beab-4662-96c5-e458d6cd3e83-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jv9gg\" (UID: \"06c229d1-beab-4662-96c5-e458d6cd3e83\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.111836 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.112932 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06c229d1-beab-4662-96c5-e458d6cd3e83-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jv9gg\" (UID: \"06c229d1-beab-4662-96c5-e458d6cd3e83\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.122374 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cs7c\" (UniqueName: \"kubernetes.io/projected/06c229d1-beab-4662-96c5-e458d6cd3e83-kube-api-access-9cs7c\") pod \"ovnkube-control-plane-749d76644c-jv9gg\" (UID: \"06c229d1-beab-4662-96c5-e458d6cd3e83\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.134515 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.146473 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.150475 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.150638 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.150718 4775 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.150796 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.150865 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:11Z","lastTransitionTime":"2025-12-16T14:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.212259 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" Dec 16 14:55:11 crc kubenswrapper[4775]: W1216 14:55:11.230526 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06c229d1_beab_4662_96c5_e458d6cd3e83.slice/crio-c2b03dc2ba7b12fab746dcca5324d46397822b5cc361efcc5a2219f6529f136d WatchSource:0}: Error finding container c2b03dc2ba7b12fab746dcca5324d46397822b5cc361efcc5a2219f6529f136d: Status 404 returned error can't find the container with id c2b03dc2ba7b12fab746dcca5324d46397822b5cc361efcc5a2219f6529f136d Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.255170 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.255236 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.255249 4775 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.255269 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.255280 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:11Z","lastTransitionTime":"2025-12-16T14:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.337841 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:11 crc kubenswrapper[4775]: E1216 14:55:11.338024 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.358042 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.358082 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.358093 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.358111 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.358124 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:11Z","lastTransitionTime":"2025-12-16T14:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.460420 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.460463 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.460477 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.460494 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.460506 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:11Z","lastTransitionTime":"2025-12-16T14:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.563975 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.564037 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.564056 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.564079 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.564096 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:11Z","lastTransitionTime":"2025-12-16T14:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.607466 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" event={"ID":"06c229d1-beab-4662-96c5-e458d6cd3e83","Type":"ContainerStarted","Data":"c2b03dc2ba7b12fab746dcca5324d46397822b5cc361efcc5a2219f6529f136d"} Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.607521 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.608036 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.608101 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.648764 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.648939 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.664966 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.666973 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.667007 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.667016 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.667032 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.667042 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:11Z","lastTransitionTime":"2025-12-16T14:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.679777 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.706670 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e1
04db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.722610 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.734828 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.744163 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.761732 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://028f8743408bac43d8cd8fb2663da7bcc80969c5fce34edfe15f09ca946fb8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.770061 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.770127 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.770139 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.770165 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.770177 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:11Z","lastTransitionTime":"2025-12-16T14:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.776343 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.787060 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.796400 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.813432 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b
3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.829773 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.845811 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.860212 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.872713 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.872748 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.872757 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.872773 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.872784 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:11Z","lastTransitionTime":"2025-12-16T14:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.873587 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.887945 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.905652 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f63
07b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.917213 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.942571 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://028f8743408bac43d8cd8fb2663da7bcc80969c5fce34edfe15f09ca946fb8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.973200 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.975163 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.975246 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.975271 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.975303 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.975329 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:11Z","lastTransitionTime":"2025-12-16T14:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:11 crc kubenswrapper[4775]: I1216 14:55:11.993680 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.007723 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.021318 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.038609 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.053754 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.073261 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.077695 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.077744 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.077754 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.077770 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.077783 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:12Z","lastTransitionTime":"2025-12-16T14:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.088447 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.100528 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.111506 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.124501 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.145278 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.161340 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.181531 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.181582 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.181598 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.181621 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.181637 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:12Z","lastTransitionTime":"2025-12-16T14:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.232870 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.252593 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9
da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.267710 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.283943 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.283999 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.284012 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.284030 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.284041 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:12Z","lastTransitionTime":"2025-12-16T14:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.286634 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.308875 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.326708 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.337265 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:12 crc kubenswrapper[4775]: E1216 14:55:12.337577 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.337443 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:12 crc kubenswrapper[4775]: E1216 14:55:12.338029 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.347515 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036
cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.359148 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.365309 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-c6mdt"] Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.365914 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:12 crc kubenswrapper[4775]: E1216 14:55:12.365998 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.370707 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16
T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.381606 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f8
3a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.386497 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.386548 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.386560 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:12 crc 
kubenswrapper[4775]: I1216 14:55:12.386577 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.386588 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:12Z","lastTransitionTime":"2025-12-16T14:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.391555 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.405186 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e38470259
53f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.416682 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.433355 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.447025 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b
3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.458611 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.489503 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:12 crc 
kubenswrapper[4775]: I1216 14:55:12.489537 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.489546 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.489559 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.489571 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:12Z","lastTransitionTime":"2025-12-16T14:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.491540 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://028f8743408bac43d8cd8fb2663da7bcc80969c5fce34edfe15f09ca946fb8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.518186 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.526185 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs\") pod \"network-metrics-daemon-c6mdt\" (UID: \"3d592ae8-792f-4cc5-9a32-b278deb33810\") " pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.526247 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrwzx\" (UniqueName: \"kubernetes.io/projected/3d592ae8-792f-4cc5-9a32-b278deb33810-kube-api-access-nrwzx\") pod \"network-metrics-daemon-c6mdt\" (UID: \"3d592ae8-792f-4cc5-9a32-b278deb33810\") " pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.541419 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.561951 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.577488 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.592105 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.592154 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.592165 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.592183 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.592196 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:12Z","lastTransitionTime":"2025-12-16T14:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.596051 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be
30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:
38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.608661 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.611925 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovnkube-controller/0.log" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.614265 4775 generic.go:334] "Generic (PLEG): container finished" podID="524488dd-74ee-43ea-ac0f-5e04d59af434" 
containerID="028f8743408bac43d8cd8fb2663da7bcc80969c5fce34edfe15f09ca946fb8da" exitCode=1 Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.614315 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerDied","Data":"028f8743408bac43d8cd8fb2663da7bcc80969c5fce34edfe15f09ca946fb8da"} Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.614869 4775 scope.go:117] "RemoveContainer" containerID="028f8743408bac43d8cd8fb2663da7bcc80969c5fce34edfe15f09ca946fb8da" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.616913 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" event={"ID":"06c229d1-beab-4662-96c5-e458d6cd3e83","Type":"ContainerStarted","Data":"11a6e5f62229033d4f67f9b49a2b4f15a7b08a20e401d6d665b585eab5adc45c"} Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.616950 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" event={"ID":"06c229d1-beab-4662-96c5-e458d6cd3e83","Type":"ContainerStarted","Data":"c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0"} Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.620469 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.627195 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs\") pod \"network-metrics-daemon-c6mdt\" (UID: \"3d592ae8-792f-4cc5-9a32-b278deb33810\") " pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.627245 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrwzx\" (UniqueName: \"kubernetes.io/projected/3d592ae8-792f-4cc5-9a32-b278deb33810-kube-api-access-nrwzx\") pod \"network-metrics-daemon-c6mdt\" (UID: \"3d592ae8-792f-4cc5-9a32-b278deb33810\") " pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:12 crc kubenswrapper[4775]: E1216 14:55:12.627568 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:55:12 crc kubenswrapper[4775]: E1216 14:55:12.627631 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs podName:3d592ae8-792f-4cc5-9a32-b278deb33810 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:13.127611107 +0000 UTC m=+38.078690040 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs") pod "network-metrics-daemon-c6mdt" (UID: "3d592ae8-792f-4cc5-9a32-b278deb33810") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.632280 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc 
kubenswrapper[4775]: I1216 14:55:12.641789 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f
lz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.646711 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrwzx\" (UniqueName: \"kubernetes.io/projected/3d592ae8-792f-4cc5-9a32-b278deb33810-kube-api-access-nrwzx\") pod \"network-metrics-daemon-c6mdt\" (UID: \"3d592ae8-792f-4cc5-9a32-b278deb33810\") " pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.653282 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.665128 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.673269 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.684939 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b
3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.695530 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.695565 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.695574 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.695589 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.695602 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:12Z","lastTransitionTime":"2025-12-16T14:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.696375 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.716826 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://028f8743408bac43d8cd8fb2663da7bcc80969c5fce34edfe15f09ca946fb8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.726801 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc 
kubenswrapper[4775]: I1216 14:55:12.740322 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1
cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.765184 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.779070 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.792005 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.798124 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.798184 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.798198 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.798219 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.798231 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:12Z","lastTransitionTime":"2025-12-16T14:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.803865 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.814025 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.825430 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.839772 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 
14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb77
0f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.853403 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.895736 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.900615 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.900644 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.900653 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.900665 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.900674 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:12Z","lastTransitionTime":"2025-12-16T14:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.940200 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:12 crc kubenswrapper[4775]: I1216 14:55:12.975593 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.002784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.002820 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.002845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.002860 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.002870 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:13Z","lastTransitionTime":"2025-12-16T14:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.019022 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://028f8743408bac43d8cd8fb2663da7bcc80969c5fce34edfe15f09ca946fb8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://028f8743408bac43d8cd8fb2663da7bcc80969c5fce34edfe15f09ca946fb8da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"9 6158 admin_network_policy_namespace.go:53] Processing sync for Namespace openshift-etcd-operator in Admin Network Policy controller\\\\nI1216 14:55:12.563483 6158 admin_network_policy_namespace.go:56] Finished syncing 
Namespace openshift-etcd-operator Admin Network Policy controller: took 3.56µs\\\\nI1216 14:55:12.563488 6158 admin_network_policy_namespace.go:53] Processing sync for Namespace openshift-operator-lifecycle-manager in Admin Network Policy controller\\\\nI1216 14:55:12.563491 6158 admin_network_policy_namespace.go:56] Finished syncing Namespace openshift-operator-lifecycle-manager Admin Network Policy controller: took 3.31µs\\\\nI1216 14:55:12.563496 6158 admin_network_policy_namespace.go:53] Processing sync for Namespace openshift-ingress-canary in Admin Network Policy controller\\\\nI1216 14:55:12.563499 6158 admin_network_policy_namespace.go:56] Finished syncing Namespace openshift-ingress-canary Admin Network Policy controller: took 3.26µs\\\\nI1216 14:55:12.563674 6158 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1216 14:55:12.563749 6158 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1216 14:55:12.563796 6158 ovnkube.go:599] Stopped ovnkube\\\\nI1216 14:55:12.563820 6158 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1216 14:55:12.563878 6158 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d
90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.052960 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.095634 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.105869 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.105933 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.105945 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.105965 4775 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.105980 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:13Z","lastTransitionTime":"2025-12-16T14:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.134178 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.136561 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs\") pod \"network-metrics-daemon-c6mdt\" (UID: \"3d592ae8-792f-4cc5-9a32-b278deb33810\") " pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:13 crc kubenswrapper[4775]: E1216 14:55:13.136698 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:55:13 crc kubenswrapper[4775]: E1216 14:55:13.136762 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs podName:3d592ae8-792f-4cc5-9a32-b278deb33810 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:14.136745789 +0000 UTC m=+39.087824712 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs") pod "network-metrics-daemon-c6mdt" (UID: "3d592ae8-792f-4cc5-9a32-b278deb33810") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.173848 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.208283 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.208343 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.208359 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.208383 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.208396 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:13Z","lastTransitionTime":"2025-12-16T14:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.214068 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.311185 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.311232 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.311245 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 
14:55:13.311264 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.311276 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:13Z","lastTransitionTime":"2025-12-16T14:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.337615 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:13 crc kubenswrapper[4775]: E1216 14:55:13.337763 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.414072 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.414126 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.414139 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.414157 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.414170 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:13Z","lastTransitionTime":"2025-12-16T14:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.517275 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.517329 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.517339 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.517357 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.517368 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:13Z","lastTransitionTime":"2025-12-16T14:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.619147 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.619202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.619213 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.619230 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.619242 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:13Z","lastTransitionTime":"2025-12-16T14:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.622413 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovnkube-controller/0.log" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.624933 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerStarted","Data":"d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38"} Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.624970 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.637378 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.646511 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08
a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.668671 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.681346 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.696594 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.712476 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.721826 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 
14:55:13.721857 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.721866 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.721882 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.721906 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:13Z","lastTransitionTime":"2025-12-16T14:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.740389 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://028f8743408bac43d8cd8fb2663da7bcc80969c5fce34edfe15f09ca946fb8da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"9 6158 admin_network_policy_namespace.go:53] Processing sync for Namespace 
openshift-etcd-operator in Admin Network Policy controller\\\\nI1216 14:55:12.563483 6158 admin_network_policy_namespace.go:56] Finished syncing Namespace openshift-etcd-operator Admin Network Policy controller: took 3.56µs\\\\nI1216 14:55:12.563488 6158 admin_network_policy_namespace.go:53] Processing sync for Namespace openshift-operator-lifecycle-manager in Admin Network Policy controller\\\\nI1216 14:55:12.563491 6158 admin_network_policy_namespace.go:56] Finished syncing Namespace openshift-operator-lifecycle-manager Admin Network Policy controller: took 3.31µs\\\\nI1216 14:55:12.563496 6158 admin_network_policy_namespace.go:53] Processing sync for Namespace openshift-ingress-canary in Admin Network Policy controller\\\\nI1216 14:55:12.563499 6158 admin_network_policy_namespace.go:56] Finished syncing Namespace openshift-ingress-canary Admin Network Policy controller: took 3.26µs\\\\nI1216 14:55:12.563674 6158 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1216 14:55:12.563749 6158 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1216 14:55:12.563796 6158 ovnkube.go:599] Stopped ovnkube\\\\nI1216 14:55:12.563820 6158 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1216 14:55:12.563878 6158 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.753767 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc 
kubenswrapper[4775]: I1216 14:55:13.774026 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1
cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.790407 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.805390 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.823563 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b
3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.824445 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.824480 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.824492 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.824511 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.824525 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:13Z","lastTransitionTime":"2025-12-16T14:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.848217 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.863505 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.880930 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.893926 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.913137 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:13Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.928000 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.928118 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.928212 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.928294 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:13 crc kubenswrapper[4775]: I1216 14:55:13.928365 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:13Z","lastTransitionTime":"2025-12-16T14:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.031560 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.031850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.031861 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.031876 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.031919 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:14Z","lastTransitionTime":"2025-12-16T14:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.134415 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.134458 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.134467 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.134481 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.134492 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:14Z","lastTransitionTime":"2025-12-16T14:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.147230 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs\") pod \"network-metrics-daemon-c6mdt\" (UID: \"3d592ae8-792f-4cc5-9a32-b278deb33810\") " pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:14 crc kubenswrapper[4775]: E1216 14:55:14.147441 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:55:14 crc kubenswrapper[4775]: E1216 14:55:14.147519 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs podName:3d592ae8-792f-4cc5-9a32-b278deb33810 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:16.147500367 +0000 UTC m=+41.098579290 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs") pod "network-metrics-daemon-c6mdt" (UID: "3d592ae8-792f-4cc5-9a32-b278deb33810") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.237688 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.237755 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.237782 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.237842 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.237869 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:14Z","lastTransitionTime":"2025-12-16T14:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.337569 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.337630 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.337572 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:14 crc kubenswrapper[4775]: E1216 14:55:14.337718 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:14 crc kubenswrapper[4775]: E1216 14:55:14.337829 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:14 crc kubenswrapper[4775]: E1216 14:55:14.337917 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.340009 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.340060 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.340072 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.340092 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.340104 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:14Z","lastTransitionTime":"2025-12-16T14:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.442997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.443095 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.443110 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.443126 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.443442 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:14Z","lastTransitionTime":"2025-12-16T14:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.545812 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.546162 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.546253 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.546335 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.546419 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:14Z","lastTransitionTime":"2025-12-16T14:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.630213 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovnkube-controller/1.log" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.630871 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovnkube-controller/0.log" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.634144 4775 generic.go:334] "Generic (PLEG): container finished" podID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerID="d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38" exitCode=1 Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.634208 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerDied","Data":"d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38"} Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.634258 4775 scope.go:117] "RemoveContainer" containerID="028f8743408bac43d8cd8fb2663da7bcc80969c5fce34edfe15f09ca946fb8da" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.635208 4775 scope.go:117] "RemoveContainer" containerID="d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38" Dec 16 14:55:14 crc kubenswrapper[4775]: E1216 14:55:14.635554 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.648327 4775 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.648370 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.648411 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.648428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.648436 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:14Z","lastTransitionTime":"2025-12-16T14:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.649296 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:14Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:14 crc 
kubenswrapper[4775]: I1216 14:55:14.663967 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1
cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:14Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.677390 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:14Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.686028 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:14Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.698771 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b
3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:14Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.711183 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:14Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.733067 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://028f8743408bac43d8cd8fb2663da7bcc80969c5fce34edfe15f09ca946fb8da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"9 6158 admin_network_policy_namespace.go:53] Processing sync for Namespace openshift-etcd-operator in Admin Network Policy controller\\\\nI1216 14:55:12.563483 6158 admin_network_policy_namespace.go:56] Finished syncing Namespace openshift-etcd-operator Admin Network Policy controller: took 3.56µs\\\\nI1216 14:55:12.563488 6158 
admin_network_policy_namespace.go:53] Processing sync for Namespace openshift-operator-lifecycle-manager in Admin Network Policy controller\\\\nI1216 14:55:12.563491 6158 admin_network_policy_namespace.go:56] Finished syncing Namespace openshift-operator-lifecycle-manager Admin Network Policy controller: took 3.31µs\\\\nI1216 14:55:12.563496 6158 admin_network_policy_namespace.go:53] Processing sync for Namespace openshift-ingress-canary in Admin Network Policy controller\\\\nI1216 14:55:12.563499 6158 admin_network_policy_namespace.go:56] Finished syncing Namespace openshift-ingress-canary Admin Network Policy controller: took 3.26µs\\\\nI1216 14:55:12.563674 6158 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1216 14:55:12.563749 6158 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1216 14:55:12.563796 6158 ovnkube.go:599] Stopped ovnkube\\\\nI1216 14:55:12.563820 6158 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1216 14:55:12.563878 6158 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:13Z\\\",\\\"message\\\":\\\"twork_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1216 14:55:13.334843 6322 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 14:55:13.334558 6322 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 14:55:13.334882 6322 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\"
,\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":
\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:14Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 
14:55:14.744130 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:14Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.750521 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.750550 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.750558 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.750570 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.750579 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:14Z","lastTransitionTime":"2025-12-16T14:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.757909 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:14Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.770944 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:14Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.784730 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:14Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.795741 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08
a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:14Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.814841 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:14Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.826722 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:14Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.838101 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:14Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.848194 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:14Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.852874 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 
14:55:14.852940 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.852956 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.852976 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.852989 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:14Z","lastTransitionTime":"2025-12-16T14:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.857284 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:14Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.955364 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.955423 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.955440 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.955464 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:14 crc kubenswrapper[4775]: I1216 14:55:14.955481 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:14Z","lastTransitionTime":"2025-12-16T14:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.057815 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.057871 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.057903 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.057925 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.057938 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:15Z","lastTransitionTime":"2025-12-16T14:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.160458 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.160538 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.160549 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.160570 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.160582 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:15Z","lastTransitionTime":"2025-12-16T14:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.262670 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.262738 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.262750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.262769 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.262782 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:15Z","lastTransitionTime":"2025-12-16T14:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.337834 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:15 crc kubenswrapper[4775]: E1216 14:55:15.338016 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.359737 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.365276 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.365306 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.365318 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.365334 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.365348 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:15Z","lastTransitionTime":"2025-12-16T14:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.370366 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.380972 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.397871 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.410070 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.423830 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.436462 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.448177 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.462840 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b
3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.467750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.467793 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.467806 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.467822 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.467834 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:15Z","lastTransitionTime":"2025-12-16T14:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.478062 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.498350 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://028f8743408bac43d8cd8fb2663da7bcc80969c5fce34edfe15f09ca946fb8da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"9 6158 admin_network_policy_namespace.go:53] Processing sync for Namespace openshift-etcd-operator in Admin Network Policy controller\\\\nI1216 14:55:12.563483 6158 admin_network_policy_namespace.go:56] Finished syncing Namespace openshift-etcd-operator Admin Network Policy controller: took 3.56µs\\\\nI1216 14:55:12.563488 6158 
admin_network_policy_namespace.go:53] Processing sync for Namespace openshift-operator-lifecycle-manager in Admin Network Policy controller\\\\nI1216 14:55:12.563491 6158 admin_network_policy_namespace.go:56] Finished syncing Namespace openshift-operator-lifecycle-manager Admin Network Policy controller: took 3.31µs\\\\nI1216 14:55:12.563496 6158 admin_network_policy_namespace.go:53] Processing sync for Namespace openshift-ingress-canary in Admin Network Policy controller\\\\nI1216 14:55:12.563499 6158 admin_network_policy_namespace.go:56] Finished syncing Namespace openshift-ingress-canary Admin Network Policy controller: took 3.26µs\\\\nI1216 14:55:12.563674 6158 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1216 14:55:12.563749 6158 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1216 14:55:12.563796 6158 ovnkube.go:599] Stopped ovnkube\\\\nI1216 14:55:12.563820 6158 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1216 14:55:12.563878 6158 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:13Z\\\",\\\"message\\\":\\\"twork_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1216 14:55:13.334843 6322 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 14:55:13.334558 6322 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 14:55:13.334882 6322 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\"
,\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":
\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 
14:55:15.510844 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc 
kubenswrapper[4775]: I1216 14:55:15.524775 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1
cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.540874 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.554790 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.570140 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.570218 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.570230 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.570255 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.570270 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:15Z","lastTransitionTime":"2025-12-16T14:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.573560 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.588119 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.640709 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovnkube-controller/1.log" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.645505 4775 scope.go:117] "RemoveContainer" containerID="d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38" Dec 16 14:55:15 crc kubenswrapper[4775]: E1216 14:55:15.645773 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.692948 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 
14:55:15.692990 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.693001 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.693015 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.693024 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:15Z","lastTransitionTime":"2025-12-16T14:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.713483 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.726661 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.748801 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b
3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.766476 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.786129 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:13Z\\\",\\\"message\\\":\\\"twork_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1216 14:55:13.334843 6322 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} 
options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 14:55:13.334558 6322 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 14:55:13.334882 6322 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f9018
4913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.795687 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.795723 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.795737 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.795753 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.795764 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:15Z","lastTransitionTime":"2025-12-16T14:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.799836 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc 
kubenswrapper[4775]: I1216 14:55:15.814958 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1
cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.828827 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.841320 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.856265 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.871298 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.894179 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.898358 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.898585 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.898692 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.898814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.898926 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:15Z","lastTransitionTime":"2025-12-16T14:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.909432 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.922322 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.933963 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.944726 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:15 crc kubenswrapper[4775]: I1216 14:55:15.956405 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.001284 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.001320 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.001331 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.001348 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.001360 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:16Z","lastTransitionTime":"2025-12-16T14:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.103372 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.103414 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.103426 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.103442 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.103451 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:16Z","lastTransitionTime":"2025-12-16T14:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.167439 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs\") pod \"network-metrics-daemon-c6mdt\" (UID: \"3d592ae8-792f-4cc5-9a32-b278deb33810\") " pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:16 crc kubenswrapper[4775]: E1216 14:55:16.167592 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:55:16 crc kubenswrapper[4775]: E1216 14:55:16.167659 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs podName:3d592ae8-792f-4cc5-9a32-b278deb33810 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:20.167635859 +0000 UTC m=+45.118714792 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs") pod "network-metrics-daemon-c6mdt" (UID: "3d592ae8-792f-4cc5-9a32-b278deb33810") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.206296 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.206349 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.206358 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.206372 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.206381 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:16Z","lastTransitionTime":"2025-12-16T14:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.309466 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.309549 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.309582 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.309612 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.309634 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:16Z","lastTransitionTime":"2025-12-16T14:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.337859 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:16 crc kubenswrapper[4775]: E1216 14:55:16.338277 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.337969 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:16 crc kubenswrapper[4775]: E1216 14:55:16.338509 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.337935 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:16 crc kubenswrapper[4775]: E1216 14:55:16.338681 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.412382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.412424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.412433 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.412448 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.412457 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:16Z","lastTransitionTime":"2025-12-16T14:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.514955 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.515005 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.515013 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.515027 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.515036 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:16Z","lastTransitionTime":"2025-12-16T14:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.618020 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.618074 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.618088 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.618107 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.618119 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:16Z","lastTransitionTime":"2025-12-16T14:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.722204 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.722250 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.722265 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.722285 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.722296 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:16Z","lastTransitionTime":"2025-12-16T14:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.824472 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.824762 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.824835 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.824927 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.825004 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:16Z","lastTransitionTime":"2025-12-16T14:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.927942 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.928202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.928280 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.928411 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:16 crc kubenswrapper[4775]: I1216 14:55:16.928496 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:16Z","lastTransitionTime":"2025-12-16T14:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.031328 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.031367 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.031377 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.031425 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.031435 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:17Z","lastTransitionTime":"2025-12-16T14:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.132351 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.132391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.132403 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.132421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.132435 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:17Z","lastTransitionTime":"2025-12-16T14:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:17 crc kubenswrapper[4775]: E1216 14:55:17.145834 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:17Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.150273 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.150342 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.150355 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.150373 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.150401 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:17Z","lastTransitionTime":"2025-12-16T14:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:17 crc kubenswrapper[4775]: E1216 14:55:17.172093 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:17Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.178876 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.178954 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.178964 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.178981 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.178991 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:17Z","lastTransitionTime":"2025-12-16T14:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:17 crc kubenswrapper[4775]: E1216 14:55:17.193149 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:17Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.197306 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.197352 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.197365 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.197385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.197397 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:17Z","lastTransitionTime":"2025-12-16T14:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:17 crc kubenswrapper[4775]: E1216 14:55:17.209200 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:17Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.213402 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.213448 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.213457 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.213474 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.213486 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:17Z","lastTransitionTime":"2025-12-16T14:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:17 crc kubenswrapper[4775]: E1216 14:55:17.226186 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:17Z is after 2025-08-24T17:21:41Z"
Dec 16 14:55:17 crc kubenswrapper[4775]: E1216 14:55:17.226306 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.228011 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.228073 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.228088 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.228107 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.228120 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:17Z","lastTransitionTime":"2025-12-16T14:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.330939 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.330991 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.331005 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.331020 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.331030 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:17Z","lastTransitionTime":"2025-12-16T14:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.337307 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 14:55:17 crc kubenswrapper[4775]: E1216 14:55:17.337423 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.434236 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.434691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.434845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.435021 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.435145 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:17Z","lastTransitionTime":"2025-12-16T14:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.538325 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.538386 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.538429 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.538454 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.538468 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:17Z","lastTransitionTime":"2025-12-16T14:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.641288 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.641624 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.641731 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.641841 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.641974 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:17Z","lastTransitionTime":"2025-12-16T14:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.744964 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.745015 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.745026 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.745043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.745054 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:17Z","lastTransitionTime":"2025-12-16T14:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.848010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.848062 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.848074 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.848092 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.848106 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:17Z","lastTransitionTime":"2025-12-16T14:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.950667 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.950732 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.950751 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.950775 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:17 crc kubenswrapper[4775]: I1216 14:55:17.951024 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:17Z","lastTransitionTime":"2025-12-16T14:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.053855 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.053958 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.053968 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.053988 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.054002 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:18Z","lastTransitionTime":"2025-12-16T14:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.156766 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.156820 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.156831 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.156847 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.156859 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:18Z","lastTransitionTime":"2025-12-16T14:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.260527 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.260757 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.260879 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.260981 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.260997 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:18Z","lastTransitionTime":"2025-12-16T14:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.336870 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.336971 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 14:55:18 crc kubenswrapper[4775]: E1216 14:55:18.337082 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.337207 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 14:55:18 crc kubenswrapper[4775]: E1216 14:55:18.337223 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 14:55:18 crc kubenswrapper[4775]: E1216 14:55:18.337409 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.363481 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.363533 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.363544 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.363563 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.363575 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:18Z","lastTransitionTime":"2025-12-16T14:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.466228 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.466298 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.466315 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.466338 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.466357 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:18Z","lastTransitionTime":"2025-12-16T14:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.569326 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.569363 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.569374 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.569390 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.569400 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:18Z","lastTransitionTime":"2025-12-16T14:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.671268 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.671309 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.671321 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.671336 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.671347 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:18Z","lastTransitionTime":"2025-12-16T14:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.774345 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.774461 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.774481 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.774507 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.774527 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:18Z","lastTransitionTime":"2025-12-16T14:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.877527 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.877587 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.877603 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.877621 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.877636 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:18Z","lastTransitionTime":"2025-12-16T14:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.980401 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.980459 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.980477 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.980500 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:18 crc kubenswrapper[4775]: I1216 14:55:18.980517 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:18Z","lastTransitionTime":"2025-12-16T14:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.083315 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.083393 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.083416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.083445 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.083470 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:19Z","lastTransitionTime":"2025-12-16T14:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.186347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.186391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.186399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.186416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.186426 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:19Z","lastTransitionTime":"2025-12-16T14:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.289475 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.289573 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.289592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.289617 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.289639 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:19Z","lastTransitionTime":"2025-12-16T14:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.337759 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 14:55:19 crc kubenswrapper[4775]: E1216 14:55:19.338040 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.392306 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.392368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.392385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.392407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.392424 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:19Z","lastTransitionTime":"2025-12-16T14:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.495364 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.495426 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.495443 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.495468 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.495487 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:19Z","lastTransitionTime":"2025-12-16T14:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.598725 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.598783 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.598801 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.598824 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.598842 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:19Z","lastTransitionTime":"2025-12-16T14:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.701595 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.701658 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.701676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.701704 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.701722 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:19Z","lastTransitionTime":"2025-12-16T14:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.805204 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.805252 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.805267 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.805290 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.805304 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:19Z","lastTransitionTime":"2025-12-16T14:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.908808 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.908918 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.908941 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.908971 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:19 crc kubenswrapper[4775]: I1216 14:55:19.908993 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:19Z","lastTransitionTime":"2025-12-16T14:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.012565 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.012663 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.012706 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.012743 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.012771 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:20Z","lastTransitionTime":"2025-12-16T14:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.116372 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.116418 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.116433 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.116456 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.116472 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:20Z","lastTransitionTime":"2025-12-16T14:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.219500 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.219535 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.219544 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.219558 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.219566 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:20Z","lastTransitionTime":"2025-12-16T14:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.231573 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs\") pod \"network-metrics-daemon-c6mdt\" (UID: \"3d592ae8-792f-4cc5-9a32-b278deb33810\") " pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:20 crc kubenswrapper[4775]: E1216 14:55:20.231952 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:55:20 crc kubenswrapper[4775]: E1216 14:55:20.232120 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs podName:3d592ae8-792f-4cc5-9a32-b278deb33810 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:28.232078699 +0000 UTC m=+53.183157672 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs") pod "network-metrics-daemon-c6mdt" (UID: "3d592ae8-792f-4cc5-9a32-b278deb33810") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.322504 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.322556 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.322572 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.322597 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.322614 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:20Z","lastTransitionTime":"2025-12-16T14:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.342162 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.342298 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.342409 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:20 crc kubenswrapper[4775]: E1216 14:55:20.342508 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:20 crc kubenswrapper[4775]: E1216 14:55:20.342629 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:20 crc kubenswrapper[4775]: E1216 14:55:20.342734 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.426540 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.426605 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.426629 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.426659 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.426686 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:20Z","lastTransitionTime":"2025-12-16T14:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.529666 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.529740 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.529763 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.529793 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.529815 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:20Z","lastTransitionTime":"2025-12-16T14:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.633202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.633259 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.633274 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.633293 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.633307 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:20Z","lastTransitionTime":"2025-12-16T14:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.735761 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.735813 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.735826 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.735850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.735866 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:20Z","lastTransitionTime":"2025-12-16T14:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.838021 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.838087 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.838097 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.838115 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.838126 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:20Z","lastTransitionTime":"2025-12-16T14:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.940191 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.940490 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.940556 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.940622 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:20 crc kubenswrapper[4775]: I1216 14:55:20.940678 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:20Z","lastTransitionTime":"2025-12-16T14:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.042710 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.042768 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.042781 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.042798 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.042809 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:21Z","lastTransitionTime":"2025-12-16T14:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.145564 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.145874 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.145987 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.146055 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.146131 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:21Z","lastTransitionTime":"2025-12-16T14:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.249149 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.249263 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.249284 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.249313 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.249334 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:21Z","lastTransitionTime":"2025-12-16T14:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.337244 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:21 crc kubenswrapper[4775]: E1216 14:55:21.337437 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.352802 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.352850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.352868 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.352929 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.352956 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:21Z","lastTransitionTime":"2025-12-16T14:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.455302 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.455364 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.455381 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.455404 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.455423 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:21Z","lastTransitionTime":"2025-12-16T14:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.558493 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.558559 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.558581 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.558609 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.558631 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:21Z","lastTransitionTime":"2025-12-16T14:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.661338 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.661396 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.661409 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.661428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.661443 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:21Z","lastTransitionTime":"2025-12-16T14:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.764738 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.764795 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.764814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.764837 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.764855 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:21Z","lastTransitionTime":"2025-12-16T14:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.867717 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.867812 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.867831 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.867857 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.867875 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:21Z","lastTransitionTime":"2025-12-16T14:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.970980 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.971046 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.971064 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.971089 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:21 crc kubenswrapper[4775]: I1216 14:55:21.971106 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:21Z","lastTransitionTime":"2025-12-16T14:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.074209 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.074293 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.074317 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.074352 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.074380 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:22Z","lastTransitionTime":"2025-12-16T14:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.177359 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.177435 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.177451 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.177480 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.177500 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:22Z","lastTransitionTime":"2025-12-16T14:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.280669 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.280736 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.280752 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.280775 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.280792 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:22Z","lastTransitionTime":"2025-12-16T14:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.337112 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.337156 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.337251 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:22 crc kubenswrapper[4775]: E1216 14:55:22.337326 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:22 crc kubenswrapper[4775]: E1216 14:55:22.337438 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:22 crc kubenswrapper[4775]: E1216 14:55:22.337504 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.383733 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.383824 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.383848 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.383883 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.383946 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:22Z","lastTransitionTime":"2025-12-16T14:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.486435 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.486493 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.486505 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.486527 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.486538 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:22Z","lastTransitionTime":"2025-12-16T14:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.589932 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.590005 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.590019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.590043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.590065 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:22Z","lastTransitionTime":"2025-12-16T14:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.662669 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.663675 4775 scope.go:117] "RemoveContainer" containerID="d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38" Dec 16 14:55:22 crc kubenswrapper[4775]: E1216 14:55:22.663870 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.692712 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.692771 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.692784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.692805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.692820 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:22Z","lastTransitionTime":"2025-12-16T14:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.795301 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.795369 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.795382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.795399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.795412 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:22Z","lastTransitionTime":"2025-12-16T14:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.897820 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.897862 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.897874 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.897902 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:22 crc kubenswrapper[4775]: I1216 14:55:22.897914 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:22Z","lastTransitionTime":"2025-12-16T14:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.000422 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.000468 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.000479 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.000499 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.000512 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:23Z","lastTransitionTime":"2025-12-16T14:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.102974 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.103010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.103023 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.103074 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.103088 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:23Z","lastTransitionTime":"2025-12-16T14:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.204752 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.204784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.204793 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.204806 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.204816 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:23Z","lastTransitionTime":"2025-12-16T14:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.309116 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.309174 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.309185 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.309203 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.309218 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:23Z","lastTransitionTime":"2025-12-16T14:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.336986 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:23 crc kubenswrapper[4775]: E1216 14:55:23.337116 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.411834 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.411964 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.411980 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.412006 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.412022 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:23Z","lastTransitionTime":"2025-12-16T14:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.515024 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.515086 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.515097 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.515116 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.515129 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:23Z","lastTransitionTime":"2025-12-16T14:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.618092 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.618157 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.618180 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.618208 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.618223 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:23Z","lastTransitionTime":"2025-12-16T14:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.721778 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.721836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.721854 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.721878 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.721927 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:23Z","lastTransitionTime":"2025-12-16T14:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.824226 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.824281 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.824296 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.824314 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.824328 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:23Z","lastTransitionTime":"2025-12-16T14:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.926863 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.926932 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.926950 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.926966 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:23 crc kubenswrapper[4775]: I1216 14:55:23.926977 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:23Z","lastTransitionTime":"2025-12-16T14:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.029205 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.029272 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.029334 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.029360 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.029379 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:24Z","lastTransitionTime":"2025-12-16T14:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.133485 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.133554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.133577 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.133609 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.133634 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:24Z","lastTransitionTime":"2025-12-16T14:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.237129 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.237208 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.237226 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.237251 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.237271 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:24Z","lastTransitionTime":"2025-12-16T14:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.337304 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.337402 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:24 crc kubenswrapper[4775]: E1216 14:55:24.337464 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.337544 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:24 crc kubenswrapper[4775]: E1216 14:55:24.337682 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:24 crc kubenswrapper[4775]: E1216 14:55:24.337826 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.339215 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.339253 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.339265 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.339340 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.339377 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:24Z","lastTransitionTime":"2025-12-16T14:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.442290 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.442335 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.442343 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.442358 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.442368 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:24Z","lastTransitionTime":"2025-12-16T14:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.544590 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.544623 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.544631 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.544644 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.544653 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:24Z","lastTransitionTime":"2025-12-16T14:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.647220 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.647262 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.647275 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.647292 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.647303 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:24Z","lastTransitionTime":"2025-12-16T14:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.750396 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.750434 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.750448 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.750467 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.750478 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:24Z","lastTransitionTime":"2025-12-16T14:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.853539 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.853594 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.853605 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.853618 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.853627 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:24Z","lastTransitionTime":"2025-12-16T14:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.957099 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.957152 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.957164 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.957178 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:24 crc kubenswrapper[4775]: I1216 14:55:24.957187 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:24Z","lastTransitionTime":"2025-12-16T14:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.059093 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.059146 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.059167 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.059191 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.059209 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:25Z","lastTransitionTime":"2025-12-16T14:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.162030 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.162105 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.162121 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.162147 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.162163 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:25Z","lastTransitionTime":"2025-12-16T14:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.265660 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.265724 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.265737 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.265755 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.265767 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:25Z","lastTransitionTime":"2025-12-16T14:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.337585 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:25 crc kubenswrapper[4775]: E1216 14:55:25.337802 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.354031 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/opensh
ift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.367033 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.368230 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.368277 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:25 crc 
kubenswrapper[4775]: I1216 14:55:25.368286 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.368303 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.368315 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:25Z","lastTransitionTime":"2025-12-16T14:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.379673 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.393459 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b
3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.413057 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.431757 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:13Z\\\",\\\"message\\\":\\\"twork_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1216 14:55:13.334843 6322 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} 
options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 14:55:13.334558 6322 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 14:55:13.334882 6322 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f9018
4913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.445261 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.456363 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.470560 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.470712 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.470741 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 
14:55:25.470774 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.470792 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:25Z","lastTransitionTime":"2025-12-16T14:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.472580 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.492177 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.509126 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.530811 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.547452 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.561951 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.571657 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.573850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 
14:55:25.574114 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.574134 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.574156 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.574167 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:25Z","lastTransitionTime":"2025-12-16T14:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.583459 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.596036 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.676676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.676715 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.676723 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.676737 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.676747 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:25Z","lastTransitionTime":"2025-12-16T14:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.778867 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.778908 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.778917 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.778932 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.778941 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:25Z","lastTransitionTime":"2025-12-16T14:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.881429 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.881488 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.881501 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.881521 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.881533 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:25Z","lastTransitionTime":"2025-12-16T14:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.983784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.983822 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.983834 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.983853 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:25 crc kubenswrapper[4775]: I1216 14:55:25.983867 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:25Z","lastTransitionTime":"2025-12-16T14:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.086956 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.087055 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.087081 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.087112 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.087134 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:26Z","lastTransitionTime":"2025-12-16T14:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.094400 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.094506 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.094569 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:26 crc kubenswrapper[4775]: E1216 14:55:26.094718 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:55:26 crc kubenswrapper[4775]: E1216 14:55:26.094748 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:55:58.094712037 +0000 UTC m=+83.045791000 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:55:26 crc kubenswrapper[4775]: E1216 14:55:26.094759 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:55:26 crc kubenswrapper[4775]: E1216 14:55:26.094794 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:58.094774978 +0000 UTC m=+83.045853941 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:55:26 crc kubenswrapper[4775]: E1216 14:55:26.094946 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:58.094916652 +0000 UTC m=+83.045995615 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.190036 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.190093 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.190110 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.190138 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.190154 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:26Z","lastTransitionTime":"2025-12-16T14:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.195397 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.195841 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:26 crc kubenswrapper[4775]: E1216 14:55:26.195685 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:55:26 crc kubenswrapper[4775]: E1216 14:55:26.195958 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:55:26 crc kubenswrapper[4775]: E1216 14:55:26.195981 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:55:26 crc kubenswrapper[4775]: E1216 14:55:26.196070 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:58.196043119 +0000 UTC m=+83.147122072 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:55:26 crc kubenswrapper[4775]: E1216 14:55:26.196111 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:55:26 crc kubenswrapper[4775]: E1216 14:55:26.196170 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:55:26 crc kubenswrapper[4775]: E1216 14:55:26.196193 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:55:26 crc kubenswrapper[4775]: E1216 14:55:26.196262 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:58.196236985 +0000 UTC m=+83.147315948 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.293460 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.293537 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.293556 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.293578 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.293599 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:26Z","lastTransitionTime":"2025-12-16T14:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.337400 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.337400 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.337504 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:26 crc kubenswrapper[4775]: E1216 14:55:26.337666 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:26 crc kubenswrapper[4775]: E1216 14:55:26.337799 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:26 crc kubenswrapper[4775]: E1216 14:55:26.338002 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.396938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.397060 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.397123 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.397160 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.397220 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:26Z","lastTransitionTime":"2025-12-16T14:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.501148 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.501260 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.501287 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.501313 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.501331 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:26Z","lastTransitionTime":"2025-12-16T14:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.609834 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.610033 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.610073 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.610116 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.610156 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:26Z","lastTransitionTime":"2025-12-16T14:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.714112 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.714205 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.714229 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.714301 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.714329 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:26Z","lastTransitionTime":"2025-12-16T14:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.817126 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.817192 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.817210 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.817306 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.817328 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:26Z","lastTransitionTime":"2025-12-16T14:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.920470 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.920534 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.920588 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.920615 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:26 crc kubenswrapper[4775]: I1216 14:55:26.920697 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:26Z","lastTransitionTime":"2025-12-16T14:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.023320 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.023365 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.023375 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.023392 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.023406 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:27Z","lastTransitionTime":"2025-12-16T14:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.126502 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.126545 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.126558 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.126575 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.126586 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:27Z","lastTransitionTime":"2025-12-16T14:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.229034 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.229110 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.229122 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.229139 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.229151 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:27Z","lastTransitionTime":"2025-12-16T14:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.332920 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.333020 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.333034 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.333083 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.333103 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:27Z","lastTransitionTime":"2025-12-16T14:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.337403 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:27 crc kubenswrapper[4775]: E1216 14:55:27.337637 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.436306 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.436353 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.436365 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.436381 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.436392 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:27Z","lastTransitionTime":"2025-12-16T14:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.525976 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.526201 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.526224 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.526247 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.526264 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:27Z","lastTransitionTime":"2025-12-16T14:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:27 crc kubenswrapper[4775]: E1216 14:55:27.546121 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.550711 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.550784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.550806 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.550833 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.550853 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:27Z","lastTransitionTime":"2025-12-16T14:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:27 crc kubenswrapper[4775]: E1216 14:55:27.569767 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.575394 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.575447 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.575463 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.575484 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.575500 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:27Z","lastTransitionTime":"2025-12-16T14:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:27 crc kubenswrapper[4775]: E1216 14:55:27.588380 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.592138 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.592183 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.592197 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.592218 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.592234 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:27Z","lastTransitionTime":"2025-12-16T14:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:27 crc kubenswrapper[4775]: E1216 14:55:27.606069 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.610256 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.610290 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.610301 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.610321 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.610334 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:27Z","lastTransitionTime":"2025-12-16T14:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:27 crc kubenswrapper[4775]: E1216 14:55:27.623585 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:27 crc kubenswrapper[4775]: E1216 14:55:27.623761 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.625366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.625419 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.625430 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.625445 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.625454 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:27Z","lastTransitionTime":"2025-12-16T14:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.727745 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.727791 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.727803 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.727818 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.727827 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:27Z","lastTransitionTime":"2025-12-16T14:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.830716 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.830918 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.830942 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.830997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.831026 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:27Z","lastTransitionTime":"2025-12-16T14:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.933660 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.933741 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.933765 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.933795 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:27 crc kubenswrapper[4775]: I1216 14:55:27.933816 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:27Z","lastTransitionTime":"2025-12-16T14:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.037855 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.037968 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.037999 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.038031 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.038052 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:28Z","lastTransitionTime":"2025-12-16T14:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.141027 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.141085 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.141097 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.141114 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.141127 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:28Z","lastTransitionTime":"2025-12-16T14:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.244019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.244063 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.244079 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.244103 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.244120 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:28Z","lastTransitionTime":"2025-12-16T14:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.319130 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs\") pod \"network-metrics-daemon-c6mdt\" (UID: \"3d592ae8-792f-4cc5-9a32-b278deb33810\") " pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:28 crc kubenswrapper[4775]: E1216 14:55:28.319336 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:55:28 crc kubenswrapper[4775]: E1216 14:55:28.319448 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs podName:3d592ae8-792f-4cc5-9a32-b278deb33810 nodeName:}" failed. No retries permitted until 2025-12-16 14:55:44.319414614 +0000 UTC m=+69.270493577 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs") pod "network-metrics-daemon-c6mdt" (UID: "3d592ae8-792f-4cc5-9a32-b278deb33810") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.337198 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.337250 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:28 crc kubenswrapper[4775]: E1216 14:55:28.337375 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:28 crc kubenswrapper[4775]: E1216 14:55:28.337465 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.337710 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:28 crc kubenswrapper[4775]: E1216 14:55:28.338059 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.352002 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.352058 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.352082 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.352119 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.352138 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:28Z","lastTransitionTime":"2025-12-16T14:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.455501 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.455552 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.455567 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.455592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.455609 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:28Z","lastTransitionTime":"2025-12-16T14:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.511381 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.524018 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.535542 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b06
63de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.550974 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.558337 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 
14:55:28.558416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.558440 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.558471 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.558494 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:28Z","lastTransitionTime":"2025-12-16T14:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.565978 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.580723 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.596187 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.608794 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.624364 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 
14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb77
0f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.639308 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.649077 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.660836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.660883 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.660910 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.660924 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.660935 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:28Z","lastTransitionTime":"2025-12-16T14:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.664461 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.679767 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.702217 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:13Z\\\",\\\"message\\\":\\\"twork_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1216 14:55:13.334843 6322 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} 
options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 14:55:13.334558 6322 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 14:55:13.334882 6322 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f9018
4913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.716694 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.730978 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.746337 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.761096 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.763411 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.763455 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.763467 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.763482 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.763493 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:28Z","lastTransitionTime":"2025-12-16T14:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.775639 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.866841 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.866911 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.866925 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 
14:55:28.866947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.866963 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:28Z","lastTransitionTime":"2025-12-16T14:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.969366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.969419 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.969431 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.969450 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:28 crc kubenswrapper[4775]: I1216 14:55:28.969467 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:28Z","lastTransitionTime":"2025-12-16T14:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.071980 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.072567 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.072656 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.072750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.072844 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:29Z","lastTransitionTime":"2025-12-16T14:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.175780 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.175823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.175839 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.175860 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.175876 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:29Z","lastTransitionTime":"2025-12-16T14:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.278665 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.278750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.278775 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.278800 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.278818 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:29Z","lastTransitionTime":"2025-12-16T14:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.337065 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:29 crc kubenswrapper[4775]: E1216 14:55:29.337242 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.382027 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.382110 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.382141 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.382168 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.382184 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:29Z","lastTransitionTime":"2025-12-16T14:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.484768 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.484825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.484836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.484855 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.484867 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:29Z","lastTransitionTime":"2025-12-16T14:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.588694 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.588776 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.588796 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.588823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.588845 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:29Z","lastTransitionTime":"2025-12-16T14:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.690653 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.690701 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.690710 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.690726 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.690735 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:29Z","lastTransitionTime":"2025-12-16T14:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.793670 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.793731 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.793747 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.793770 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.793787 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:29Z","lastTransitionTime":"2025-12-16T14:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.896892 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.896951 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.896963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.896978 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:29 crc kubenswrapper[4775]: I1216 14:55:29.896988 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:29Z","lastTransitionTime":"2025-12-16T14:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.000033 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.000111 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.000128 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.000154 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.000173 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:30Z","lastTransitionTime":"2025-12-16T14:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.103629 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.103672 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.103682 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.103698 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.103708 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:30Z","lastTransitionTime":"2025-12-16T14:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.207354 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.207428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.207451 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.207478 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.207495 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:30Z","lastTransitionTime":"2025-12-16T14:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.310389 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.310430 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.310441 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.310460 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.310471 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:30Z","lastTransitionTime":"2025-12-16T14:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.337180 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.337251 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:30 crc kubenswrapper[4775]: E1216 14:55:30.337378 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:30 crc kubenswrapper[4775]: E1216 14:55:30.337552 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.337699 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:30 crc kubenswrapper[4775]: E1216 14:55:30.338055 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.414028 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.414061 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.414072 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.414086 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.414095 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:30Z","lastTransitionTime":"2025-12-16T14:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.517349 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.517413 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.517429 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.517450 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.517466 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:30Z","lastTransitionTime":"2025-12-16T14:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.621016 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.621055 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.621065 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.621082 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.621094 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:30Z","lastTransitionTime":"2025-12-16T14:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.724272 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.724320 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.724337 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.724357 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.724372 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:30Z","lastTransitionTime":"2025-12-16T14:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.827406 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.827489 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.827526 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.827594 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.827615 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:30Z","lastTransitionTime":"2025-12-16T14:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.930876 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.930978 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.931001 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.931032 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:30 crc kubenswrapper[4775]: I1216 14:55:30.931057 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:30Z","lastTransitionTime":"2025-12-16T14:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.033362 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.033437 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.033461 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.033491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.033511 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:31Z","lastTransitionTime":"2025-12-16T14:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.135637 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.135708 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.135730 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.135762 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.135784 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:31Z","lastTransitionTime":"2025-12-16T14:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.238755 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.238817 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.238832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.238851 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.238863 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:31Z","lastTransitionTime":"2025-12-16T14:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.337560 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:31 crc kubenswrapper[4775]: E1216 14:55:31.337796 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.341491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.341527 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.341539 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.341559 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.341571 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:31Z","lastTransitionTime":"2025-12-16T14:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.444661 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.444814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.444833 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.444858 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.444877 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:31Z","lastTransitionTime":"2025-12-16T14:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.548306 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.548378 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.548411 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.548431 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.548442 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:31Z","lastTransitionTime":"2025-12-16T14:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.651922 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.651969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.651980 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.651997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.652018 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:31Z","lastTransitionTime":"2025-12-16T14:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.754181 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.754233 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.754247 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.754266 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.754279 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:31Z","lastTransitionTime":"2025-12-16T14:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.857103 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.857151 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.857164 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.857185 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.857199 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:31Z","lastTransitionTime":"2025-12-16T14:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.959785 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.959847 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.959861 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.959880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:31 crc kubenswrapper[4775]: I1216 14:55:31.959917 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:31Z","lastTransitionTime":"2025-12-16T14:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.062764 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.062819 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.062836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.062858 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.062877 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:32Z","lastTransitionTime":"2025-12-16T14:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.165488 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.165540 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.165556 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.165573 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.165583 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:32Z","lastTransitionTime":"2025-12-16T14:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.268270 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.268318 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.268329 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.268347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.268360 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:32Z","lastTransitionTime":"2025-12-16T14:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.337102 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.337159 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:32 crc kubenswrapper[4775]: E1216 14:55:32.337231 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:32 crc kubenswrapper[4775]: E1216 14:55:32.337318 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.337164 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:32 crc kubenswrapper[4775]: E1216 14:55:32.337457 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.370845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.370943 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.370963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.370996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.371019 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:32Z","lastTransitionTime":"2025-12-16T14:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.474303 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.474365 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.474385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.474411 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.474429 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:32Z","lastTransitionTime":"2025-12-16T14:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.577222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.577271 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.577292 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.577383 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.577397 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:32Z","lastTransitionTime":"2025-12-16T14:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.680822 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.680932 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.680952 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.680977 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.680993 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:32Z","lastTransitionTime":"2025-12-16T14:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.784035 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.784065 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.784073 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.784087 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.784095 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:32Z","lastTransitionTime":"2025-12-16T14:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.887213 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.887263 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.887283 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.887309 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.887331 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:32Z","lastTransitionTime":"2025-12-16T14:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.989255 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.989286 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.989295 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.989309 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:32 crc kubenswrapper[4775]: I1216 14:55:32.989318 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:32Z","lastTransitionTime":"2025-12-16T14:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.092986 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.093057 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.093070 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.093089 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.093104 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:33Z","lastTransitionTime":"2025-12-16T14:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.195630 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.195673 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.195684 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.195700 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.195710 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:33Z","lastTransitionTime":"2025-12-16T14:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.297845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.297914 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.297927 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.297947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.297963 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:33Z","lastTransitionTime":"2025-12-16T14:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.337835 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:33 crc kubenswrapper[4775]: E1216 14:55:33.338036 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.339023 4775 scope.go:117] "RemoveContainer" containerID="d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.400740 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.400791 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.400805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.400823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.400839 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:33Z","lastTransitionTime":"2025-12-16T14:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.502943 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.502987 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.502998 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.503014 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.503025 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:33Z","lastTransitionTime":"2025-12-16T14:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.605969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.606042 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.606065 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.606094 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.606119 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:33Z","lastTransitionTime":"2025-12-16T14:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.708979 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.709024 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.709037 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.709054 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.709068 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:33Z","lastTransitionTime":"2025-12-16T14:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.811551 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.811610 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.811630 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.811653 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.811669 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:33Z","lastTransitionTime":"2025-12-16T14:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.914122 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.914169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.914179 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.914196 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:33 crc kubenswrapper[4775]: I1216 14:55:33.914206 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:33Z","lastTransitionTime":"2025-12-16T14:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.019717 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.019767 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.019783 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.019804 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.019818 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:34Z","lastTransitionTime":"2025-12-16T14:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.122661 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.122944 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.122955 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.122968 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.122977 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:34Z","lastTransitionTime":"2025-12-16T14:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.225456 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.225493 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.225506 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.225522 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.225533 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:34Z","lastTransitionTime":"2025-12-16T14:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.328188 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.328221 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.328235 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.328251 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.328265 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:34Z","lastTransitionTime":"2025-12-16T14:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.337126 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:34 crc kubenswrapper[4775]: E1216 14:55:34.337251 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.337439 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:34 crc kubenswrapper[4775]: E1216 14:55:34.337509 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.337663 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:34 crc kubenswrapper[4775]: E1216 14:55:34.337745 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.430609 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.430655 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.430677 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.430697 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.430709 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:34Z","lastTransitionTime":"2025-12-16T14:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.533198 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.533226 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.533235 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.533249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.533257 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:34Z","lastTransitionTime":"2025-12-16T14:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.635417 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.635452 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.635464 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.635479 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.635489 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:34Z","lastTransitionTime":"2025-12-16T14:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.711661 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovnkube-controller/1.log" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.715259 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerStarted","Data":"2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f"} Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.716553 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.737612 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.737655 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.737666 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.737684 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.737695 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:34Z","lastTransitionTime":"2025-12-16T14:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.743840 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca
4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.757181 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c602406f-1aab-45b5-b815-41c4f89fa869\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131fe40257ce003285c74c2cc7160316851ec72690dd09901ec8b16468e0d107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdad84c13b928859836825f69d08d47815805b625941bb708e4057dfe754d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17fa2414d74d950bfd3e9631cdf0da6bc8b58f406d485d086d084d305ad5d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.772167 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.787241 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.801524 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.813477 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08
a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.830788 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.840263 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.840292 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.840301 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.840318 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.840332 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:34Z","lastTransitionTime":"2025-12-16T14:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.841560 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.852584 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.861015 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.870134 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.879015 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:34 crc 
kubenswrapper[4775]: I1216 14:55:34.890403 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1
cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.902131 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.910463 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.922334 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b
3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.932761 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.942855 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:34 crc 
kubenswrapper[4775]: I1216 14:55:34.942886 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.942910 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.942925 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.942946 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:34Z","lastTransitionTime":"2025-12-16T14:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:34 crc kubenswrapper[4775]: I1216 14:55:34.951110 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:13Z\\\",\\\"message\\\":\\\"twork_controller.go:776] Recording success event on pod 
openshift-network-operator/iptables-alerter-4ln5h\\\\nI1216 14:55:13.334843 6322 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 14:55:13.334558 6322 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 14:55:13.334882 6322 model_client.go:398] Mutate operations generated as: [{Op:mutate 
Table:Logi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\
"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:34Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.045931 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.045994 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.046013 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.046039 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.046059 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:35Z","lastTransitionTime":"2025-12-16T14:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.148524 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.148567 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.148574 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.148589 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.148599 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:35Z","lastTransitionTime":"2025-12-16T14:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.251858 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.251925 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.251935 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.251951 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.251962 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:35Z","lastTransitionTime":"2025-12-16T14:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.337293 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:35 crc kubenswrapper[4775]: E1216 14:55:35.337587 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.354428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.354460 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.354468 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.354480 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.354488 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:35Z","lastTransitionTime":"2025-12-16T14:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.356532 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca
4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.369636 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c602406f-1aab-45b5-b815-41c4f89fa869\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131fe40257ce003285c74c2cc7160316851ec72690dd09901ec8b16468e0d107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdad84c13b928859836825f69d08d47815805b625941bb708e4057dfe754d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17fa2414d74d950bfd3e9631cdf0da6bc8b58f406d485d086d084d305ad5d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.389416 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.408893 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.429242 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.446541 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08
a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.457125 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.457184 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.457201 4775 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.457220 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.457233 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:35Z","lastTransitionTime":"2025-12-16T14:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.478369 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196
189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.497505 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.513442 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.529135 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.544084 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.561036 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc 
kubenswrapper[4775]: I1216 14:55:35.561090 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.561298 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.561326 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.561408 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.561526 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:35Z","lastTransitionTime":"2025-12-16T14:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.580835 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.603037 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.621949 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.650075 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b
3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.665487 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.665565 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.665584 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.665638 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.665658 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:35Z","lastTransitionTime":"2025-12-16T14:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.667160 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.697068 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:13Z\\\",\\\"message\\\":\\\"twork_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1216 14:55:13.334843 6322 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} 
options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 14:55:13.334558 6322 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 14:55:13.334882 6322 model_client.go:398] Mutate operations generated as: [{Op:mutate 
Table:Logi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\
"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.721004 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovnkube-controller/2.log" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.721739 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovnkube-controller/1.log" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.724504 4775 generic.go:334] "Generic (PLEG): container finished" podID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerID="2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f" exitCode=1 Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.725915 4775 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerDied","Data":"2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f"} Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.725970 4775 scope.go:117] "RemoveContainer" containerID="d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.726849 4775 scope.go:117] "RemoveContainer" containerID="2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f" Dec 16 14:55:35 crc kubenswrapper[4775]: E1216 14:55:35.727039 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.740157 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.755285 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 
14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb77
0f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.768352 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.769021 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.769112 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.769180 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.769241 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.769306 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:35Z","lastTransitionTime":"2025-12-16T14:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.779761 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.794990 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c
25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203
bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-
cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.809838 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.844090 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b0c89e33bd880932e00fb036b02119158dcf499dca43dc1956b952afc07a38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:13Z\\\",\\\"message\\\":\\\"twork_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1216 14:55:13.334843 6322 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} 
options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 14:55:13.334558 6322 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 14:55:13.334882 6322 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:35Z\\\",\\\"message\\\":\\\"r removal\\\\nI1216 14:55:34.786527 6531 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:55:34.786545 6531 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:55:34.786568 6531 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:55:34.786591 6531 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1216 14:55:34.786611 6531 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:55:34.786620 6531 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1216 14:55:34.786627 6531 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1216 14:55:34.786633 6531 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 14:55:34.786646 6531 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1216 14:55:34.786646 6531 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:55:34.786663 6531 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 14:55:34.786977 6531 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:55:34.786992 6531 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:55:34.787015 6531 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:55:34.787044 6531 factory.go:656] Stopping watch factory\\\\nI1216 14:55:34.787061 6531 ovnkube.go:599] Stopped ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"
mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 
14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.859498 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad39073
96046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287f
aaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.870882 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c602406f-1aab-45b5-b815-41c4f89fa869\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131fe40257ce003285c74c2cc7160316851ec72690dd09901ec8b16468e0d107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdad84c13b928859836825f69d08d47815805b625941bb708e4057dfe754d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17fa2414d74d950bfd3e9631cdf0da6bc8b58f406d485d086d084d305ad5d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.871628 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.871693 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.871708 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.871729 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.871745 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:35Z","lastTransitionTime":"2025-12-16T14:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.883761 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.895416 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.907364 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.920132 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08
a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.944092 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.956430 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.969085 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.973533 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.973730 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.973804 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.973893 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.973980 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:35Z","lastTransitionTime":"2025-12-16T14:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.979704 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:35 crc kubenswrapper[4775]: I1216 14:55:35.989434 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.076638 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.076678 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.076689 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.076705 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.076717 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:36Z","lastTransitionTime":"2025-12-16T14:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.179213 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.179267 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.179280 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.179300 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.179313 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:36Z","lastTransitionTime":"2025-12-16T14:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.281733 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.281794 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.281814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.281839 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.281859 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:36Z","lastTransitionTime":"2025-12-16T14:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.337467 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.337597 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:36 crc kubenswrapper[4775]: E1216 14:55:36.337603 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:36 crc kubenswrapper[4775]: E1216 14:55:36.337790 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.338021 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:36 crc kubenswrapper[4775]: E1216 14:55:36.338162 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.383696 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.383759 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.383769 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.383784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.383798 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:36Z","lastTransitionTime":"2025-12-16T14:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.486237 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.486340 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.486376 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.486420 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.486447 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:36Z","lastTransitionTime":"2025-12-16T14:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.589615 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.589690 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.589714 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.589745 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.589769 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:36Z","lastTransitionTime":"2025-12-16T14:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.693021 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.693068 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.693077 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.693122 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.693134 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:36Z","lastTransitionTime":"2025-12-16T14:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.731815 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovnkube-controller/2.log" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.737806 4775 scope.go:117] "RemoveContainer" containerID="2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f" Dec 16 14:55:36 crc kubenswrapper[4775]: E1216 14:55:36.738226 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.764580 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.780484 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.797742 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.797794 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.797813 4775 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.797838 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.797856 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:36Z","lastTransitionTime":"2025-12-16T14:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.800797 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.813781 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f8
3a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.827930 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.843252 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.864799 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 
14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb77
0f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.880516 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.893339 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.900509 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.900548 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.900567 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.900590 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.900606 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:36Z","lastTransitionTime":"2025-12-16T14:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.910467 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.924857 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.957428 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:35Z\\\",\\\"message\\\":\\\"r removal\\\\nI1216 14:55:34.786527 6531 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:55:34.786545 6531 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:55:34.786568 6531 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:55:34.786591 6531 handler.go:190] Sending 
*v1.EgressIP event handler 8 for removal\\\\nI1216 14:55:34.786611 6531 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:55:34.786620 6531 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1216 14:55:34.786627 6531 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1216 14:55:34.786633 6531 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 14:55:34.786646 6531 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1216 14:55:34.786646 6531 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:55:34.786663 6531 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 14:55:34.786977 6531 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:55:34.786992 6531 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:55:34.787015 6531 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:55:34.787044 6531 factory.go:656] Stopping watch factory\\\\nI1216 14:55:34.787061 6531 ovnkube.go:599] Stopped ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f9018
4913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.975082 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:36 crc kubenswrapper[4775]: I1216 14:55:36.990265 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.000297 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c602406f-1aab-45b5-b815-41c4f89fa869\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131fe40257ce003285c74c2cc7160316851ec72690dd09901ec8b16468e0d107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdad84c13b928859836825f69d08d47815805b625941bb708e4057dfe754d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17fa2414d74d950bfd3e9631cdf0da6bc8b58f406d485d086d084d305ad5d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.002939 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.002980 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.002996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.003015 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.003030 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:37Z","lastTransitionTime":"2025-12-16T14:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.016088 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.031748 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.046701 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.106041 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.106091 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.106103 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.106122 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.106139 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:37Z","lastTransitionTime":"2025-12-16T14:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.208634 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.208678 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.208691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.208712 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.208725 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:37Z","lastTransitionTime":"2025-12-16T14:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.311832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.311948 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.311976 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.312006 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.312029 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:37Z","lastTransitionTime":"2025-12-16T14:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.337934 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:37 crc kubenswrapper[4775]: E1216 14:55:37.338125 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.414550 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.414617 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.414639 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.414711 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.414738 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:37Z","lastTransitionTime":"2025-12-16T14:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.517603 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.517653 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.517674 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.517720 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.517747 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:37Z","lastTransitionTime":"2025-12-16T14:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.621083 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.621140 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.621157 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.621178 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.621196 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:37Z","lastTransitionTime":"2025-12-16T14:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.723879 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.723982 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.724010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.724053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.724077 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:37Z","lastTransitionTime":"2025-12-16T14:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.826515 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.826564 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.826573 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.826589 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.826599 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:37Z","lastTransitionTime":"2025-12-16T14:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.929870 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.929934 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.929946 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.929970 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.929986 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:37Z","lastTransitionTime":"2025-12-16T14:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.990522 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.990570 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.990585 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.990604 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:37 crc kubenswrapper[4775]: I1216 14:55:37.990626 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:37Z","lastTransitionTime":"2025-12-16T14:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:38 crc kubenswrapper[4775]: E1216 14:55:38.011116 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.016171 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.016233 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.016254 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.016283 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.016302 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:38Z","lastTransitionTime":"2025-12-16T14:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:38 crc kubenswrapper[4775]: E1216 14:55:38.038416 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.042931 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.043026 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.043053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.043090 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.043117 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:38Z","lastTransitionTime":"2025-12-16T14:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:38 crc kubenswrapper[4775]: E1216 14:55:38.063562 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.068653 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.068749 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.068770 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.068819 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.068837 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:38Z","lastTransitionTime":"2025-12-16T14:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:38 crc kubenswrapper[4775]: E1216 14:55:38.087024 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.091181 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.091228 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.091238 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.091253 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.091264 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:38Z","lastTransitionTime":"2025-12-16T14:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:38 crc kubenswrapper[4775]: E1216 14:55:38.104706 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:38 crc kubenswrapper[4775]: E1216 14:55:38.105301 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.108100 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.108163 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.108187 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.108215 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.108238 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:38Z","lastTransitionTime":"2025-12-16T14:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.211018 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.211083 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.211097 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.211114 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.211127 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:38Z","lastTransitionTime":"2025-12-16T14:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.314217 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.314256 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.314268 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.314286 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.314299 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:38Z","lastTransitionTime":"2025-12-16T14:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.336953 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.337049 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:38 crc kubenswrapper[4775]: E1216 14:55:38.337119 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:38 crc kubenswrapper[4775]: E1216 14:55:38.337205 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.337050 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:38 crc kubenswrapper[4775]: E1216 14:55:38.337409 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.417036 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.417112 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.417143 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.417172 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.417191 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:38Z","lastTransitionTime":"2025-12-16T14:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.519342 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.519394 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.519409 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.519430 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.519444 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:38Z","lastTransitionTime":"2025-12-16T14:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.621084 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.621129 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.621141 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.621158 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.621174 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:38Z","lastTransitionTime":"2025-12-16T14:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.723435 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.723480 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.723494 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.723511 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.723522 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:38Z","lastTransitionTime":"2025-12-16T14:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.825928 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.825969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.825983 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.825999 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.826010 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:38Z","lastTransitionTime":"2025-12-16T14:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.928752 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.928802 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.928810 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.928825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:38 crc kubenswrapper[4775]: I1216 14:55:38.928834 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:38Z","lastTransitionTime":"2025-12-16T14:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.031067 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.031099 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.031108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.031121 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.031131 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:39Z","lastTransitionTime":"2025-12-16T14:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.132656 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.132691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.132699 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.132714 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.132722 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:39Z","lastTransitionTime":"2025-12-16T14:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.235506 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.235548 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.235563 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.235578 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.235589 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:39Z","lastTransitionTime":"2025-12-16T14:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.336992 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:39 crc kubenswrapper[4775]: E1216 14:55:39.337122 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.339259 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.339285 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.339294 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.339306 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.339315 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:39Z","lastTransitionTime":"2025-12-16T14:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.442064 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.442682 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.442748 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.442815 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.442897 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:39Z","lastTransitionTime":"2025-12-16T14:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.546643 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.547064 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.547210 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.547308 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.547379 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:39Z","lastTransitionTime":"2025-12-16T14:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.650515 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.650575 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.650590 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.650610 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.650623 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:39Z","lastTransitionTime":"2025-12-16T14:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.753034 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.753087 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.753099 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.753115 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.753130 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:39Z","lastTransitionTime":"2025-12-16T14:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.855774 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.855823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.855837 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.855854 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.855866 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:39Z","lastTransitionTime":"2025-12-16T14:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.958351 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.958397 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.958408 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.958425 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:39 crc kubenswrapper[4775]: I1216 14:55:39.958435 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:39Z","lastTransitionTime":"2025-12-16T14:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.060977 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.061031 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.061041 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.061058 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.061070 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:40Z","lastTransitionTime":"2025-12-16T14:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.163361 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.163415 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.163427 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.163448 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.163461 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:40Z","lastTransitionTime":"2025-12-16T14:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.265798 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.265842 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.265855 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.265874 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.265908 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:40Z","lastTransitionTime":"2025-12-16T14:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.337609 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.337687 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:40 crc kubenswrapper[4775]: E1216 14:55:40.337789 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.337695 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:40 crc kubenswrapper[4775]: E1216 14:55:40.337871 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:40 crc kubenswrapper[4775]: E1216 14:55:40.338038 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.367715 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.367760 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.367770 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.367788 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.367800 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:40Z","lastTransitionTime":"2025-12-16T14:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.470165 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.470201 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.470213 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.470227 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.470238 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:40Z","lastTransitionTime":"2025-12-16T14:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.572793 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.572833 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.572844 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.572860 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.572873 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:40Z","lastTransitionTime":"2025-12-16T14:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.674986 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.675027 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.675037 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.675051 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.675061 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:40Z","lastTransitionTime":"2025-12-16T14:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.777090 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.777135 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.777166 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.777186 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.777200 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:40Z","lastTransitionTime":"2025-12-16T14:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.879579 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.879627 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.879639 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.879658 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.879670 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:40Z","lastTransitionTime":"2025-12-16T14:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.982419 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.982462 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.982471 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.982489 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:40 crc kubenswrapper[4775]: I1216 14:55:40.982499 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:40Z","lastTransitionTime":"2025-12-16T14:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.085207 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.085248 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.085261 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.085283 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.085295 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:41Z","lastTransitionTime":"2025-12-16T14:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.187592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.187646 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.187656 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.187673 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.187687 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:41Z","lastTransitionTime":"2025-12-16T14:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.289813 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.289912 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.289926 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.289947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.289960 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:41Z","lastTransitionTime":"2025-12-16T14:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.337595 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:41 crc kubenswrapper[4775]: E1216 14:55:41.337765 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.392433 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.392719 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.392854 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.392978 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.393071 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:41Z","lastTransitionTime":"2025-12-16T14:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.495195 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.495562 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.495679 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.495771 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.495838 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:41Z","lastTransitionTime":"2025-12-16T14:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.598152 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.598200 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.598212 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.598228 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.598241 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:41Z","lastTransitionTime":"2025-12-16T14:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.700912 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.700970 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.700982 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.701030 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.701049 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:41Z","lastTransitionTime":"2025-12-16T14:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.803522 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.803571 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.803582 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.803598 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.803609 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:41Z","lastTransitionTime":"2025-12-16T14:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.906437 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.906477 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.906487 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.906501 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:41 crc kubenswrapper[4775]: I1216 14:55:41.906511 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:41Z","lastTransitionTime":"2025-12-16T14:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.008753 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.008791 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.008800 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.008814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.008825 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:42Z","lastTransitionTime":"2025-12-16T14:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.111231 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.111279 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.111291 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.111309 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.111324 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:42Z","lastTransitionTime":"2025-12-16T14:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.213723 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.213769 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.213778 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.213793 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.213803 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:42Z","lastTransitionTime":"2025-12-16T14:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.317009 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.317070 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.317082 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.317099 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.317110 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:42Z","lastTransitionTime":"2025-12-16T14:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.336772 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.336817 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:42 crc kubenswrapper[4775]: E1216 14:55:42.336924 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.337031 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:42 crc kubenswrapper[4775]: E1216 14:55:42.337065 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:42 crc kubenswrapper[4775]: E1216 14:55:42.337178 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.419524 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.419562 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.419572 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.419586 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.419594 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:42Z","lastTransitionTime":"2025-12-16T14:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.522078 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.522125 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.522135 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.522158 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.522171 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:42Z","lastTransitionTime":"2025-12-16T14:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.624757 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.624829 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.624849 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.624876 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.624922 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:42Z","lastTransitionTime":"2025-12-16T14:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.727736 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.728032 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.728132 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.728302 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.728384 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:42Z","lastTransitionTime":"2025-12-16T14:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.831223 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.831323 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.831343 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.831405 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.831423 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:42Z","lastTransitionTime":"2025-12-16T14:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.934288 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.934384 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.934401 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.934425 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:42 crc kubenswrapper[4775]: I1216 14:55:42.934440 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:42Z","lastTransitionTime":"2025-12-16T14:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.036424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.036469 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.036484 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.036505 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.036520 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:43Z","lastTransitionTime":"2025-12-16T14:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.138547 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.138859 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.138989 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.139061 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.139150 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:43Z","lastTransitionTime":"2025-12-16T14:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.241666 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.241710 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.241722 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.241737 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.241747 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:43Z","lastTransitionTime":"2025-12-16T14:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.338412 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:43 crc kubenswrapper[4775]: E1216 14:55:43.338565 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.343562 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.343592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.343601 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.343616 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.343626 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:43Z","lastTransitionTime":"2025-12-16T14:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.446345 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.446396 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.446413 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.446431 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.446443 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:43Z","lastTransitionTime":"2025-12-16T14:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.549237 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.549288 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.549297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.549311 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.549321 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:43Z","lastTransitionTime":"2025-12-16T14:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.651345 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.651382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.651391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.651407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.651417 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:43Z","lastTransitionTime":"2025-12-16T14:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.754338 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.754732 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.754824 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.754938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.755046 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:43Z","lastTransitionTime":"2025-12-16T14:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.857102 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.857150 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.857159 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.857175 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.857188 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:43Z","lastTransitionTime":"2025-12-16T14:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.959939 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.960368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.960607 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.961356 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:43 crc kubenswrapper[4775]: I1216 14:55:43.961515 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:43Z","lastTransitionTime":"2025-12-16T14:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.064317 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.064353 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.064362 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.064401 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.064410 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:44Z","lastTransitionTime":"2025-12-16T14:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.166220 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.166508 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.166609 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.166679 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.166748 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:44Z","lastTransitionTime":"2025-12-16T14:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.271250 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.271306 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.271328 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.271420 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.271511 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:44Z","lastTransitionTime":"2025-12-16T14:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.337191 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.337305 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:44 crc kubenswrapper[4775]: E1216 14:55:44.337329 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.337376 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:44 crc kubenswrapper[4775]: E1216 14:55:44.337493 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:44 crc kubenswrapper[4775]: E1216 14:55:44.337557 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.347156 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.373902 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.373946 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.373956 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.373971 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.373981 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:44Z","lastTransitionTime":"2025-12-16T14:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.414767 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs\") pod \"network-metrics-daemon-c6mdt\" (UID: \"3d592ae8-792f-4cc5-9a32-b278deb33810\") " pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:44 crc kubenswrapper[4775]: E1216 14:55:44.414947 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:55:44 crc kubenswrapper[4775]: E1216 14:55:44.415005 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs podName:3d592ae8-792f-4cc5-9a32-b278deb33810 nodeName:}" failed. No retries permitted until 2025-12-16 14:56:16.414990079 +0000 UTC m=+101.366069002 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs") pod "network-metrics-daemon-c6mdt" (UID: "3d592ae8-792f-4cc5-9a32-b278deb33810") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.476476 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.476517 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.476530 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.476570 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.476584 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:44Z","lastTransitionTime":"2025-12-16T14:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.579220 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.579261 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.579272 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.579288 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.579301 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:44Z","lastTransitionTime":"2025-12-16T14:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.682193 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.682250 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.682272 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.682300 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.682320 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:44Z","lastTransitionTime":"2025-12-16T14:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.784994 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.785039 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.785049 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.785063 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.785075 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:44Z","lastTransitionTime":"2025-12-16T14:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.889065 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.889132 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.889156 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.889186 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.889209 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:44Z","lastTransitionTime":"2025-12-16T14:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.991670 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.991808 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.991903 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.991971 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:44 crc kubenswrapper[4775]: I1216 14:55:44.992028 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:44Z","lastTransitionTime":"2025-12-16T14:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.095104 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.095338 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.095395 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.095509 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.095575 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:45Z","lastTransitionTime":"2025-12-16T14:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.197654 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.197916 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.197988 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.198050 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.198151 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:45Z","lastTransitionTime":"2025-12-16T14:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.299997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.300049 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.300066 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.300090 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.300105 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:45Z","lastTransitionTime":"2025-12-16T14:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.336872 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:45 crc kubenswrapper[4775]: E1216 14:55:45.337078 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.352175 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os
-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.364528 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b6
7b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.383790 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:35Z\\\",\\\"message\\\":\\\"r removal\\\\nI1216 14:55:34.786527 6531 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:55:34.786545 6531 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:55:34.786568 6531 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:55:34.786591 6531 handler.go:190] Sending 
*v1.EgressIP event handler 8 for removal\\\\nI1216 14:55:34.786611 6531 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:55:34.786620 6531 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1216 14:55:34.786627 6531 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1216 14:55:34.786633 6531 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 14:55:34.786646 6531 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1216 14:55:34.786646 6531 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:55:34.786663 6531 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 14:55:34.786977 6531 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:55:34.786992 6531 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:55:34.787015 6531 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:55:34.787044 6531 factory.go:656] Stopping watch factory\\\\nI1216 14:55:34.787061 6531 ovnkube.go:599] Stopped ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f9018
4913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.394201 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.406776 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 
14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb77
0f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.407710 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.407807 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.407869 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.408142 4775 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.408229 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:45Z","lastTransitionTime":"2025-12-16T14:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.419229 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.429132 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.441151 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.453500 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.466059 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.477972 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c602406f-1aab-45b5-b815-41c4f89fa869\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131fe40257ce003285c74c2cc7160316851ec72690dd09901ec8b16468e0d107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdad84c13b928859836825f69d08d47815805b625941bb708e4057dfe754d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17fa2414d74d950bfd3e9631cdf0da6bc8b58f406d485d086d084d305ad5d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.492153 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.504640 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.510879 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.510963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.510979 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.510997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.511024 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:45Z","lastTransitionTime":"2025-12-16T14:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.521343 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.531601 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.542604 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.553551 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d6e6e8b-5b01-4fea-af89-216b58eb98f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5303ea6c3b5cbada36b01c138cf3db28f57fa8d5974b2e35179aef3ee62e4ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.572664 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.583973 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.613828 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.613910 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.613929 4775 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.613956 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.613975 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:45Z","lastTransitionTime":"2025-12-16T14:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.717020 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.717091 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.717109 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.717136 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.717155 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:45Z","lastTransitionTime":"2025-12-16T14:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.820051 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.820089 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.820098 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.820113 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.820121 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:45Z","lastTransitionTime":"2025-12-16T14:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.923114 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.923161 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.923171 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.923187 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:45 crc kubenswrapper[4775]: I1216 14:55:45.923198 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:45Z","lastTransitionTime":"2025-12-16T14:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.025576 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.025613 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.025622 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.025637 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.025648 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:46Z","lastTransitionTime":"2025-12-16T14:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.128080 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.128368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.128499 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.128587 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.128678 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:46Z","lastTransitionTime":"2025-12-16T14:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.231111 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.231177 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.231190 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.231205 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.231216 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:46Z","lastTransitionTime":"2025-12-16T14:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.333098 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.333440 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.333630 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.333813 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.334012 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:46Z","lastTransitionTime":"2025-12-16T14:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.337496 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.337511 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.337516 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:46 crc kubenswrapper[4775]: E1216 14:55:46.338123 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:46 crc kubenswrapper[4775]: E1216 14:55:46.338221 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:46 crc kubenswrapper[4775]: E1216 14:55:46.338268 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.437021 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.437082 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.437095 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.437111 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.437126 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:46Z","lastTransitionTime":"2025-12-16T14:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.539409 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.539472 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.539494 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.539525 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.539545 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:46Z","lastTransitionTime":"2025-12-16T14:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.642031 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.642073 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.642085 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.642100 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.642110 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:46Z","lastTransitionTime":"2025-12-16T14:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.744695 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.744731 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.744740 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.744754 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.744763 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:46Z","lastTransitionTime":"2025-12-16T14:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.765918 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc2lg_f108f76f-c79a-42b0-b5ac-714d49d9a4d5/kube-multus/0.log" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.765968 4775 generic.go:334] "Generic (PLEG): container finished" podID="f108f76f-c79a-42b0-b5ac-714d49d9a4d5" containerID="e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7" exitCode=1 Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.766003 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mc2lg" event={"ID":"f108f76f-c79a-42b0-b5ac-714d49d9a4d5","Type":"ContainerDied","Data":"e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7"} Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.766435 4775 scope.go:117] "RemoveContainer" containerID="e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.781219 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f63
07b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.793853 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:45Z\\\",\\\"message\\\":\\\"2025-12-16T14:54:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4c19a038-fc0a-4c89-bccc-fa72b8607d01\\\\n2025-12-16T14:54:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4c19a038-fc0a-4c89-bccc-fa72b8607d01 to /host/opt/cni/bin/\\\\n2025-12-16T14:54:59Z [verbose] multus-daemon started\\\\n2025-12-16T14:54:59Z [verbose] Readiness Indicator file check\\\\n2025-12-16T14:55:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.813673 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:35Z\\\",\\\"message\\\":\\\"r removal\\\\nI1216 14:55:34.786527 6531 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:55:34.786545 6531 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:55:34.786568 6531 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:55:34.786591 6531 handler.go:190] Sending 
*v1.EgressIP event handler 8 for removal\\\\nI1216 14:55:34.786611 6531 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:55:34.786620 6531 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1216 14:55:34.786627 6531 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1216 14:55:34.786633 6531 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 14:55:34.786646 6531 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1216 14:55:34.786646 6531 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:55:34.786663 6531 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 14:55:34.786977 6531 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:55:34.786992 6531 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:55:34.787015 6531 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:55:34.787044 6531 factory.go:656] Stopping watch factory\\\\nI1216 14:55:34.787061 6531 ovnkube.go:599] Stopped ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f9018
4913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.826146 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.840119 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 
14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb77
0f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.849860 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.849918 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.849927 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.849941 4775 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.849949 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:46Z","lastTransitionTime":"2025-12-16T14:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.854120 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.863924 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.881060 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.895951 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.908798 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.921298 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c602406f-1aab-45b5-b815-41c4f89fa869\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131fe40257ce003285c74c2cc7160316851ec72690dd09901ec8b16468e0d107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdad84c13b928859836825f69d08d47815805b625941bb708e4057dfe754d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17fa2414d74d950bfd3e9631cdf0da6bc8b58f406d485d086d084d305ad5d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.934516 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.944905 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.953549 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.953586 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.953597 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.953611 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.953621 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:46Z","lastTransitionTime":"2025-12-16T14:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.954576 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.962950 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.973419 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:46 crc kubenswrapper[4775]: I1216 14:55:46.982780 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d6e6e8b-5b01-4fea-af89-216b58eb98f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5303ea6c3b5cbada36b01c138cf3db28f57fa8d5974b2e35179aef3ee62e4ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.002582 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.015500 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.055918 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.055963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.055982 4775 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.056002 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.056015 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:47Z","lastTransitionTime":"2025-12-16T14:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.157836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.157905 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.157918 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.157937 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.157948 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:47Z","lastTransitionTime":"2025-12-16T14:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.260380 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.260439 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.260461 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.260487 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.260507 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:47Z","lastTransitionTime":"2025-12-16T14:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.337274 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:47 crc kubenswrapper[4775]: E1216 14:55:47.337495 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.362771 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.362819 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.362831 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.362848 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.362861 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:47Z","lastTransitionTime":"2025-12-16T14:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.465648 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.465695 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.465711 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.465733 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.465751 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:47Z","lastTransitionTime":"2025-12-16T14:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.568821 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.568904 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.568921 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.568937 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.568949 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:47Z","lastTransitionTime":"2025-12-16T14:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.671321 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.671403 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.671435 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.671468 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.671490 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:47Z","lastTransitionTime":"2025-12-16T14:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.772766 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc2lg_f108f76f-c79a-42b0-b5ac-714d49d9a4d5/kube-multus/0.log" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.772914 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mc2lg" event={"ID":"f108f76f-c79a-42b0-b5ac-714d49d9a4d5","Type":"ContainerStarted","Data":"df66b9c818cf970df880bf19cf5d511f23a4ff7bebd59e241339dd26e0ac8fa0"} Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.773515 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.773569 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.773582 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.773600 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.773616 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:47Z","lastTransitionTime":"2025-12-16T14:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.790658 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df66b9c818cf970df880bf19cf5d511f23a4ff7bebd59e241339dd26e0ac8fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:45Z\\\",\\\"message\\\":\\\"2025-12-16T14:54:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4c19a038-fc0a-4c89-bccc-fa72b8607d01\\\\n2025-12-16T14:54:59+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4c19a038-fc0a-4c89-bccc-fa72b8607d01 to /host/opt/cni/bin/\\\\n2025-12-16T14:54:59Z [verbose] multus-daemon started\\\\n2025-12-16T14:54:59Z [verbose] Readiness Indicator file check\\\\n2025-12-16T14:55:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.820830 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:35Z\\\",\\\"message\\\":\\\"r removal\\\\nI1216 14:55:34.786527 6531 handler.go:208] Removed *v1.NetworkPolicy event 
handler 4\\\\nI1216 14:55:34.786545 6531 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:55:34.786568 6531 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:55:34.786591 6531 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:55:34.786611 6531 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:55:34.786620 6531 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1216 14:55:34.786627 6531 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1216 14:55:34.786633 6531 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 14:55:34.786646 6531 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1216 14:55:34.786646 6531 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:55:34.786663 6531 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 14:55:34.786977 6531 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:55:34.786992 6531 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:55:34.787015 6531 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:55:34.787044 6531 factory.go:656] Stopping watch factory\\\\nI1216 14:55:34.787061 6531 ovnkube.go:599] Stopped ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f9018
4913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.832500 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.844229 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 
14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb77
0f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.859030 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.869855 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.876274 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.876322 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.876334 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.876351 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.876364 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:47Z","lastTransitionTime":"2025-12-16T14:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.889936 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.905180 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:
54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.917757 4775 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.930942 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c602406f-1aab-45b5-b815-41c4f89fa869\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131fe40257ce003285c74c2cc7160316851ec72690dd09901ec8b16468e0d107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdad84c13b928859836825f69d08d47815805b625941bb708e4057dfe754d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17fa2414d74d950bfd3e9631cdf0da6bc8b58f406d485d086d084d305ad5d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.946081 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.958780 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.969235 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f8
3a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.978373 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.979329 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.979354 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.979362 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.979376 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.979386 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:47Z","lastTransitionTime":"2025-12-16T14:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:47 crc kubenswrapper[4775]: I1216 14:55:47.989478 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.000307 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d6e6e8b-5b01-4fea-af89-216b58eb98f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5303ea6c3b5cbada36b01c138cf3db28f57fa8d5974b2e35179aef3ee62e4ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.021709 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.034932 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.046571 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.081856 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.081937 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.081950 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.081963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.081972 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:48Z","lastTransitionTime":"2025-12-16T14:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.185219 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.185266 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.185277 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.185296 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.185309 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:48Z","lastTransitionTime":"2025-12-16T14:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.287482 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.287536 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.287552 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.287570 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.287582 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:48Z","lastTransitionTime":"2025-12-16T14:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.316374 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.316418 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.316426 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.316445 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.316454 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:48Z","lastTransitionTime":"2025-12-16T14:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:48 crc kubenswrapper[4775]: E1216 14:55:48.328946 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.332429 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.332476 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.332487 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.332504 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.332515 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:48Z","lastTransitionTime":"2025-12-16T14:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.337498 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.337552 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:48 crc kubenswrapper[4775]: E1216 14:55:48.337611 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:48 crc kubenswrapper[4775]: E1216 14:55:48.337670 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.337556 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:48 crc kubenswrapper[4775]: E1216 14:55:48.337821 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:48 crc kubenswrapper[4775]: E1216 14:55:48.345196 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.348395 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.348437 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.348447 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.348462 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.348472 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:48Z","lastTransitionTime":"2025-12-16T14:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:48 crc kubenswrapper[4775]: E1216 14:55:48.390858 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:48 crc kubenswrapper[4775]: E1216 14:55:48.391051 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.393047 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.393096 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.393108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.393125 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.393137 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:48Z","lastTransitionTime":"2025-12-16T14:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.495908 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.495959 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.495970 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.495988 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.495999 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:48Z","lastTransitionTime":"2025-12-16T14:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.598434 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.598489 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.598509 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.598533 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.598550 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:48Z","lastTransitionTime":"2025-12-16T14:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.700966 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.701034 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.701054 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.701074 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.701087 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:48Z","lastTransitionTime":"2025-12-16T14:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.804007 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.804074 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.804090 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.804107 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.804121 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:48Z","lastTransitionTime":"2025-12-16T14:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.907656 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.907709 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.907719 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.907737 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:48 crc kubenswrapper[4775]: I1216 14:55:48.907747 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:48Z","lastTransitionTime":"2025-12-16T14:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.010491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.010543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.010555 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.010575 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.010589 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:49Z","lastTransitionTime":"2025-12-16T14:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.113988 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.114043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.114055 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.114091 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.114101 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:49Z","lastTransitionTime":"2025-12-16T14:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.217072 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.217118 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.217127 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.217143 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.217154 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:49Z","lastTransitionTime":"2025-12-16T14:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.319802 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.319877 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.319927 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.319954 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.319972 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:49Z","lastTransitionTime":"2025-12-16T14:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.337411 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:49 crc kubenswrapper[4775]: E1216 14:55:49.338068 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.338631 4775 scope.go:117] "RemoveContainer" containerID="2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f" Dec 16 14:55:49 crc kubenswrapper[4775]: E1216 14:55:49.338941 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.423642 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.423708 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.423728 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.423753 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.423773 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:49Z","lastTransitionTime":"2025-12-16T14:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.527683 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.527976 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.528041 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.528065 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.528120 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:49Z","lastTransitionTime":"2025-12-16T14:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.631091 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.631135 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.631147 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.631165 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.631177 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:49Z","lastTransitionTime":"2025-12-16T14:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.733279 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.733324 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.733336 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.733355 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.733367 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:49Z","lastTransitionTime":"2025-12-16T14:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.836917 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.836981 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.837000 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.837025 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.837042 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:49Z","lastTransitionTime":"2025-12-16T14:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.939883 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.939993 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.940013 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.940037 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:49 crc kubenswrapper[4775]: I1216 14:55:49.940053 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:49Z","lastTransitionTime":"2025-12-16T14:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.042566 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.042619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.042633 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.042651 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.042670 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:50Z","lastTransitionTime":"2025-12-16T14:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.146032 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.146237 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.146273 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.146299 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.146318 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:50Z","lastTransitionTime":"2025-12-16T14:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.249722 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.249796 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.249830 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.249861 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.249928 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:50Z","lastTransitionTime":"2025-12-16T14:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.337404 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.337426 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:50 crc kubenswrapper[4775]: E1216 14:55:50.337561 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.337529 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:50 crc kubenswrapper[4775]: E1216 14:55:50.337666 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:50 crc kubenswrapper[4775]: E1216 14:55:50.337878 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.352112 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.352147 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.352156 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.352169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.352180 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:50Z","lastTransitionTime":"2025-12-16T14:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.455377 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.455424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.455435 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.455454 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.455467 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:50Z","lastTransitionTime":"2025-12-16T14:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.558081 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.558122 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.558131 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.558147 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.558159 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:50Z","lastTransitionTime":"2025-12-16T14:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.661232 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.661300 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.661315 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.661336 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.661350 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:50Z","lastTransitionTime":"2025-12-16T14:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.763821 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.763868 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.763900 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.763920 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.763931 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:50Z","lastTransitionTime":"2025-12-16T14:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.865592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.865639 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.865652 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.865668 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.865678 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:50Z","lastTransitionTime":"2025-12-16T14:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.969266 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.969363 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.969382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.969440 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:50 crc kubenswrapper[4775]: I1216 14:55:50.969457 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:50Z","lastTransitionTime":"2025-12-16T14:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.072515 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.072575 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.072585 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.072605 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.072617 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:51Z","lastTransitionTime":"2025-12-16T14:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.175218 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.175276 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.175288 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.175304 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.175316 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:51Z","lastTransitionTime":"2025-12-16T14:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.277596 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.277662 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.277683 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.277717 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.277738 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:51Z","lastTransitionTime":"2025-12-16T14:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.338389 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:51 crc kubenswrapper[4775]: E1216 14:55:51.338609 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.380608 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.380672 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.380684 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.380702 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.380714 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:51Z","lastTransitionTime":"2025-12-16T14:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.483017 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.483053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.483065 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.483083 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.483096 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:51Z","lastTransitionTime":"2025-12-16T14:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.585817 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.585980 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.586044 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.586078 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.586100 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:51Z","lastTransitionTime":"2025-12-16T14:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.689194 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.690094 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.690396 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.690524 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.690638 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:51Z","lastTransitionTime":"2025-12-16T14:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.793195 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.793254 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.793272 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.793294 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.793311 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:51Z","lastTransitionTime":"2025-12-16T14:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.896091 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.896134 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.896142 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.896156 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.896165 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:51Z","lastTransitionTime":"2025-12-16T14:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.998618 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.998697 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.998707 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.998739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:51 crc kubenswrapper[4775]: I1216 14:55:51.998751 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:51Z","lastTransitionTime":"2025-12-16T14:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.101215 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.101261 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.101271 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.101286 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.101297 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:52Z","lastTransitionTime":"2025-12-16T14:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.203743 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.203779 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.203788 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.203819 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.203830 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:52Z","lastTransitionTime":"2025-12-16T14:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.307008 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.307057 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.307070 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.307091 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.307107 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:52Z","lastTransitionTime":"2025-12-16T14:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.337123 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.337208 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.337208 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:52 crc kubenswrapper[4775]: E1216 14:55:52.337680 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:52 crc kubenswrapper[4775]: E1216 14:55:52.337833 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:52 crc kubenswrapper[4775]: E1216 14:55:52.337868 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.409438 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.409505 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.409523 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.409548 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.409566 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:52Z","lastTransitionTime":"2025-12-16T14:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.512006 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.512049 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.512061 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.512080 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.512092 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:52Z","lastTransitionTime":"2025-12-16T14:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.615335 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.615416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.615439 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.615474 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.615497 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:52Z","lastTransitionTime":"2025-12-16T14:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.717865 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.717955 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.717977 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.718009 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.718033 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:52Z","lastTransitionTime":"2025-12-16T14:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.821117 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.821173 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.821190 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.821212 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.821230 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:52Z","lastTransitionTime":"2025-12-16T14:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.923343 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.923396 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.923413 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.923431 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:52 crc kubenswrapper[4775]: I1216 14:55:52.923467 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:52Z","lastTransitionTime":"2025-12-16T14:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.026236 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.026303 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.026329 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.026358 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.026379 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:53Z","lastTransitionTime":"2025-12-16T14:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.128790 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.128865 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.128919 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.128956 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.128980 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:53Z","lastTransitionTime":"2025-12-16T14:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.232118 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.232182 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.232256 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.232288 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.232311 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:53Z","lastTransitionTime":"2025-12-16T14:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.335679 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.335730 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.335738 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.335756 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.335771 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:53Z","lastTransitionTime":"2025-12-16T14:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.336999 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:53 crc kubenswrapper[4775]: E1216 14:55:53.337100 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.438485 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.438531 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.438542 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.438558 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.438568 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:53Z","lastTransitionTime":"2025-12-16T14:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.541506 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.541580 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.541599 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.541622 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.541639 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:53Z","lastTransitionTime":"2025-12-16T14:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.645475 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.645541 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.645578 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.645598 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.645620 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:53Z","lastTransitionTime":"2025-12-16T14:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.749365 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.749420 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.749438 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.749461 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.749477 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:53Z","lastTransitionTime":"2025-12-16T14:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.852703 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.853083 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.853589 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.854223 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.855115 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:53Z","lastTransitionTime":"2025-12-16T14:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.958562 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.958617 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.958629 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.958649 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:53 crc kubenswrapper[4775]: I1216 14:55:53.958661 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:53Z","lastTransitionTime":"2025-12-16T14:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.062081 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.062155 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.062178 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.062207 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.062229 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:54Z","lastTransitionTime":"2025-12-16T14:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.164773 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.165153 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.165258 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.165371 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.165489 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:54Z","lastTransitionTime":"2025-12-16T14:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.268708 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.268814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.268869 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.268950 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.268974 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:54Z","lastTransitionTime":"2025-12-16T14:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.337130 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:54 crc kubenswrapper[4775]: E1216 14:55:54.337594 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.337968 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:54 crc kubenswrapper[4775]: E1216 14:55:54.338151 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.338248 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:54 crc kubenswrapper[4775]: E1216 14:55:54.338422 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.372281 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.372375 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.372403 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.372437 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.372460 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:54Z","lastTransitionTime":"2025-12-16T14:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.475466 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.475532 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.475555 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.475580 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.475599 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:54Z","lastTransitionTime":"2025-12-16T14:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.578605 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.578727 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.578739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.578758 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.578771 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:54Z","lastTransitionTime":"2025-12-16T14:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.681365 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.681407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.681418 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.681436 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.681448 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:54Z","lastTransitionTime":"2025-12-16T14:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.784517 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.784603 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.784622 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.784649 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.784666 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:54Z","lastTransitionTime":"2025-12-16T14:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.887713 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.887759 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.887776 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.887799 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.887819 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:54Z","lastTransitionTime":"2025-12-16T14:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.990392 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.990448 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.990457 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.990475 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:54 crc kubenswrapper[4775]: I1216 14:55:54.990486 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:54Z","lastTransitionTime":"2025-12-16T14:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.094104 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.094624 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.094923 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.095172 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.095377 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:55Z","lastTransitionTime":"2025-12-16T14:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.197867 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.198428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.198496 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.198590 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.198685 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:55Z","lastTransitionTime":"2025-12-16T14:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.302378 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.302425 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.302434 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.302451 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.302461 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:55Z","lastTransitionTime":"2025-12-16T14:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.337291 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:55 crc kubenswrapper[4775]: E1216 14:55:55.337478 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.358504 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/opensh
ift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.373720 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.387212 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.408654 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.408737 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.408757 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.408807 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.408823 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:55Z","lastTransitionTime":"2025-12-16T14:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.410284 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.424936 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df66b9c818cf970df880bf19cf5d511f23a4ff7bebd59e241339dd26e0ac8fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:45Z\\\",\\\"message\\\":\\\"2025-12-16T14:54:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4c19a038-fc0a-4c89-bccc-fa72b8607d01\\\\n2025-12-16T14:54:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4c19a038-fc0a-4c89-bccc-fa72b8607d01 to /host/opt/cni/bin/\\\\n2025-12-16T14:54:59Z [verbose] multus-daemon started\\\\n2025-12-16T14:54:59Z [verbose] Readiness Indicator file check\\\\n2025-12-16T14:55:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-
lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.454858 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:35Z\\\",\\\"message\\\":\\\"r removal\\\\nI1216 14:55:34.786527 6531 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:55:34.786545 6531 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:55:34.786568 6531 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:55:34.786591 6531 handler.go:190] Sending 
*v1.EgressIP event handler 8 for removal\\\\nI1216 14:55:34.786611 6531 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:55:34.786620 6531 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1216 14:55:34.786627 6531 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1216 14:55:34.786633 6531 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 14:55:34.786646 6531 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1216 14:55:34.786646 6531 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:55:34.786663 6531 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 14:55:34.786977 6531 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:55:34.786992 6531 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:55:34.787015 6531 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:55:34.787044 6531 factory.go:656] Stopping watch factory\\\\nI1216 14:55:34.787061 6531 ovnkube.go:599] Stopped ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f9018
4913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.470739 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.488542 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.505311 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c602406f-1aab-45b5-b815-41c4f89fa869\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131fe40257ce003285c74c2cc7160316851ec72690dd09901ec8b16468e0d107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdad84c13b928859836825f69d08d47815805b625941bb708e4057dfe754d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17fa2414d74d950bfd3e9631cdf0da6bc8b58f406d485d086d084d305ad5d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.510751 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.510791 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.510803 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.510820 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.510834 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:55Z","lastTransitionTime":"2025-12-16T14:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.517784 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.530216 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.541406 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.552033 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d6e6e8b-5b01-4fea-af89-216b58eb98f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5303ea6c3b5cbada36b01c138cf3db28f57fa8d5974b2e35179aef3ee62e4ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.578262 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.590663 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.603470 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.625930 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.633187 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 
14:55:55.633240 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.633252 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.633268 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.633277 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:55Z","lastTransitionTime":"2025-12-16T14:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.663403 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.677192 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.735639 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.735684 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.735697 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.735717 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.735727 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:55Z","lastTransitionTime":"2025-12-16T14:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.838798 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.838847 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.838857 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.838873 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.838901 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:55Z","lastTransitionTime":"2025-12-16T14:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.941097 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.941146 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.941159 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.941180 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:55 crc kubenswrapper[4775]: I1216 14:55:55.941197 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:55Z","lastTransitionTime":"2025-12-16T14:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.044809 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.044926 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.044970 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.045002 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.045020 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:56Z","lastTransitionTime":"2025-12-16T14:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.147789 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.147880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.147964 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.147998 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.148023 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:56Z","lastTransitionTime":"2025-12-16T14:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.250858 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.250966 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.250992 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.251021 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.251047 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:56Z","lastTransitionTime":"2025-12-16T14:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.337337 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.337434 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:56 crc kubenswrapper[4775]: E1216 14:55:56.337563 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.337434 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:56 crc kubenswrapper[4775]: E1216 14:55:56.337678 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:56 crc kubenswrapper[4775]: E1216 14:55:56.337811 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.354070 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.354132 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.354153 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.354178 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.354196 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:56Z","lastTransitionTime":"2025-12-16T14:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.457140 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.457201 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.457213 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.457234 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.457246 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:56Z","lastTransitionTime":"2025-12-16T14:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.560316 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.560368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.560388 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.560416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.560434 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:56Z","lastTransitionTime":"2025-12-16T14:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.663573 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.663626 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.663642 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.663670 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.663686 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:56Z","lastTransitionTime":"2025-12-16T14:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.766775 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.766825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.766842 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.766865 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.766882 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:56Z","lastTransitionTime":"2025-12-16T14:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.869399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.869479 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.869502 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.869532 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.869553 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:56Z","lastTransitionTime":"2025-12-16T14:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.971814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.971859 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.971870 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.971908 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:56 crc kubenswrapper[4775]: I1216 14:55:56.971921 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:56Z","lastTransitionTime":"2025-12-16T14:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.074342 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.074399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.074411 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.074428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.074440 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:57Z","lastTransitionTime":"2025-12-16T14:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.177174 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.177210 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.177218 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.177232 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.177243 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:57Z","lastTransitionTime":"2025-12-16T14:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.279634 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.279682 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.279693 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.279711 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.279720 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:57Z","lastTransitionTime":"2025-12-16T14:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.337331 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:57 crc kubenswrapper[4775]: E1216 14:55:57.337525 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.382703 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.382765 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.382785 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.382810 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.382829 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:57Z","lastTransitionTime":"2025-12-16T14:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.486135 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.486197 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.486221 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.486257 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.486282 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:57Z","lastTransitionTime":"2025-12-16T14:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.589600 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.589658 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.589685 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.589711 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.589729 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:57Z","lastTransitionTime":"2025-12-16T14:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.692651 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.692784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.692807 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.692833 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.692851 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:57Z","lastTransitionTime":"2025-12-16T14:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.795671 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.795720 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.795739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.795763 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.795778 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:57Z","lastTransitionTime":"2025-12-16T14:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.898425 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.898481 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.898497 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.898515 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:57 crc kubenswrapper[4775]: I1216 14:55:57.898527 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:57Z","lastTransitionTime":"2025-12-16T14:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.001741 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.001806 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.001826 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.001855 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.001873 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:58Z","lastTransitionTime":"2025-12-16T14:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.104564 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.104618 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.104635 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.104660 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.104679 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:58Z","lastTransitionTime":"2025-12-16T14:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.166187 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.166328 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 14:57:02.166302368 +0000 UTC m=+147.117381301 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.166385 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.166462 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.166546 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.166592 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.166622 4775 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:02.166604648 +0000 UTC m=+147.117683581 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.166642 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:02.166629889 +0000 UTC m=+147.117708822 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.207535 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.207607 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.207623 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.207640 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.207651 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:58Z","lastTransitionTime":"2025-12-16T14:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.267645 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.267732 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.267849 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.267877 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.267919 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.267926 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.267936 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl 
for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.267949 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.268004 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:02.2679856 +0000 UTC m=+147.219064523 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.268043 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:02.268014441 +0000 UTC m=+147.219093404 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.310848 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.310978 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.310998 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.311021 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.311038 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:58Z","lastTransitionTime":"2025-12-16T14:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.337735 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.337765 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.337765 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.337877 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.338000 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.338201 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.413914 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.413952 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.413963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.413979 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.413990 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:58Z","lastTransitionTime":"2025-12-16T14:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.517182 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.517273 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.517291 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.517313 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.517333 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:58Z","lastTransitionTime":"2025-12-16T14:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.611569 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.611624 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.611635 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.611655 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.611667 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:58Z","lastTransitionTime":"2025-12-16T14:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.626754 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.632049 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.632101 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.632113 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.632132 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.632144 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:58Z","lastTransitionTime":"2025-12-16T14:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.648291 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.653032 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.653097 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.653106 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.653120 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.653155 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:58Z","lastTransitionTime":"2025-12-16T14:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.663991 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.668488 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.668556 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.668567 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.668583 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.668596 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:58Z","lastTransitionTime":"2025-12-16T14:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.679971 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.683262 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.683338 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.683351 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.683375 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.683387 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:58Z","lastTransitionTime":"2025-12-16T14:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.698808 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:55:58Z is after 2025-08-24T17:21:41Z" Dec 16 14:55:58 crc kubenswrapper[4775]: E1216 14:55:58.698996 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.700451 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.700504 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.700579 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.700611 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.700626 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:58Z","lastTransitionTime":"2025-12-16T14:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.803683 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.803762 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.803781 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.803809 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.803828 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:58Z","lastTransitionTime":"2025-12-16T14:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.907704 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.907750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.907766 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.907789 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:58 crc kubenswrapper[4775]: I1216 14:55:58.907805 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:58Z","lastTransitionTime":"2025-12-16T14:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.011514 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.011566 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.011583 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.011608 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.011624 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:59Z","lastTransitionTime":"2025-12-16T14:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.114326 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.114372 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.114385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.114402 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.114415 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:59Z","lastTransitionTime":"2025-12-16T14:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.216373 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.216421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.216433 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.216450 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.216464 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:59Z","lastTransitionTime":"2025-12-16T14:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.320029 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.320099 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.320120 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.320148 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.320169 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:59Z","lastTransitionTime":"2025-12-16T14:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.337945 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:55:59 crc kubenswrapper[4775]: E1216 14:55:59.338167 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.423489 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.423539 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.423558 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.423581 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.423601 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:59Z","lastTransitionTime":"2025-12-16T14:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.527132 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.527238 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.527255 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.527279 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.527296 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:59Z","lastTransitionTime":"2025-12-16T14:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.629763 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.629831 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.629849 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.629877 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.629965 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:59Z","lastTransitionTime":"2025-12-16T14:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.734308 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.734354 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.734367 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.734385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.734400 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:59Z","lastTransitionTime":"2025-12-16T14:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.837622 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.837684 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.837703 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.837729 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.837747 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:59Z","lastTransitionTime":"2025-12-16T14:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.940981 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.941077 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.941100 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.941622 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:55:59 crc kubenswrapper[4775]: I1216 14:55:59.941976 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:55:59Z","lastTransitionTime":"2025-12-16T14:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.045062 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.045117 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.045140 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.045160 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.045174 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:00Z","lastTransitionTime":"2025-12-16T14:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.148403 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.148445 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.148457 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.148474 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.148487 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:00Z","lastTransitionTime":"2025-12-16T14:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.251229 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.251301 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.251319 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.251343 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.251361 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:00Z","lastTransitionTime":"2025-12-16T14:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.337056 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.337171 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:00 crc kubenswrapper[4775]: E1216 14:56:00.337228 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.337290 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:00 crc kubenswrapper[4775]: E1216 14:56:00.337474 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:00 crc kubenswrapper[4775]: E1216 14:56:00.338165 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.338748 4775 scope.go:117] "RemoveContainer" containerID="2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.353938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.354002 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.354016 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.354039 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.354051 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:00Z","lastTransitionTime":"2025-12-16T14:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.457253 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.457682 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.457976 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.458217 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.458426 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:00Z","lastTransitionTime":"2025-12-16T14:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.560546 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.560619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.560636 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.560662 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.560683 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:00Z","lastTransitionTime":"2025-12-16T14:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.663333 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.663619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.663684 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.663750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.663824 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:00Z","lastTransitionTime":"2025-12-16T14:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.767366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.767422 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.767434 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.767453 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.767466 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:00Z","lastTransitionTime":"2025-12-16T14:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.871161 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.871527 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.871686 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.871825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.872029 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:00Z","lastTransitionTime":"2025-12-16T14:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.975230 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.975284 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.975301 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.975324 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:00 crc kubenswrapper[4775]: I1216 14:56:00.975341 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:00Z","lastTransitionTime":"2025-12-16T14:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.078384 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.078444 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.078463 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.078487 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.078505 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:01Z","lastTransitionTime":"2025-12-16T14:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.181304 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.181348 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.181357 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.181370 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.181382 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:01Z","lastTransitionTime":"2025-12-16T14:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.285269 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.285350 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.285369 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.285393 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.285412 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:01Z","lastTransitionTime":"2025-12-16T14:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.337379 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:01 crc kubenswrapper[4775]: E1216 14:56:01.337594 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.389071 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.389146 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.389168 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.389197 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.389219 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:01Z","lastTransitionTime":"2025-12-16T14:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.492201 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.492240 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.492251 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.492269 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.492283 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:01Z","lastTransitionTime":"2025-12-16T14:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.596127 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.596197 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.596208 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.596225 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.596239 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:01Z","lastTransitionTime":"2025-12-16T14:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.699265 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.699305 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.699317 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.699332 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.699344 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:01Z","lastTransitionTime":"2025-12-16T14:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.802143 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.802182 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.802195 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.802215 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.802272 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:01Z","lastTransitionTime":"2025-12-16T14:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.842265 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovnkube-controller/2.log" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.844683 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerStarted","Data":"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee"} Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.845396 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.874334 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7c
d35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.887567 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.903443 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08
a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.904661 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.904696 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.904705 4775 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.904719 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.904729 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:01Z","lastTransitionTime":"2025-12-16T14:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.912445 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d6e6e8b-5b01-4fea-af89-216b58eb98f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5303ea6c3b5cbada36b01c138cf3db28f57fa8d5974b2e35179aef3ee62e4ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.930396 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed6525
1d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.944870 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.957318 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.975415 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df66b9c818cf970df880bf19cf5d511f23a4ff7bebd59e241339dd26e0ac8fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:45Z\\\",\\\"message\\\":\\\"2025-12-16T14:54:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_4c19a038-fc0a-4c89-bccc-fa72b8607d01\\\\n2025-12-16T14:54:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4c19a038-fc0a-4c89-bccc-fa72b8607d01 to /host/opt/cni/bin/\\\\n2025-12-16T14:54:59Z [verbose] multus-daemon started\\\\n2025-12-16T14:54:59Z [verbose] Readiness Indicator file check\\\\n2025-12-16T14:55:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:01 crc kubenswrapper[4775]: I1216 14:56:01.993633 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:35Z\\\",\\\"message\\\":\\\"r removal\\\\nI1216 14:55:34.786527 6531 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:55:34.786545 6531 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:55:34.786568 6531 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:55:34.786591 6531 handler.go:190] Sending 
*v1.EgressIP event handler 8 for removal\\\\nI1216 14:55:34.786611 6531 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:55:34.786620 6531 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1216 14:55:34.786627 6531 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1216 14:55:34.786633 6531 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 14:55:34.786646 6531 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1216 14:55:34.786646 6531 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:55:34.786663 6531 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 14:55:34.786977 6531 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:55:34.786992 6531 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:55:34.787015 6531 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:55:34.787044 6531 factory.go:656] Stopping watch factory\\\\nI1216 14:55:34.787061 6531 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.006999 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.007033 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.007044 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.007077 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.007089 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:02Z","lastTransitionTime":"2025-12-16T14:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.011472 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:02 crc 
kubenswrapper[4775]: I1216 14:56:02.028406 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1
cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.054316 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.065584 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.079423 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b
3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.091803 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.102538 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.109954 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.110007 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.110019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.110038 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.110050 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:02Z","lastTransitionTime":"2025-12-16T14:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.113714 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c602406f-1aab-45b5-b815-41c4f89fa869\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131fe40257ce003285c74c2cc7160316851ec72690dd09901ec8b16468e0d107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://dbdad84c13b928859836825f69d08d47815805b625941bb708e4057dfe754d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17fa2414d74d950bfd3e9631cdf0da6bc8b58f406d485d086d084d305ad5d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.127830 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.141063 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.211981 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.212017 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.212027 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.212041 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.212049 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:02Z","lastTransitionTime":"2025-12-16T14:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.315181 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.315248 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.315261 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.315276 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.315291 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:02Z","lastTransitionTime":"2025-12-16T14:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.336975 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.337018 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.337046 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:02 crc kubenswrapper[4775]: E1216 14:56:02.337188 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:02 crc kubenswrapper[4775]: E1216 14:56:02.337262 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:02 crc kubenswrapper[4775]: E1216 14:56:02.337337 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.418594 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.418720 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.418798 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.418825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.418872 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:02Z","lastTransitionTime":"2025-12-16T14:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.521720 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.521793 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.521810 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.521832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.521850 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:02Z","lastTransitionTime":"2025-12-16T14:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.625781 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.625866 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.625933 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.625971 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.625994 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:02Z","lastTransitionTime":"2025-12-16T14:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.729019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.729066 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.729081 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.729101 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.729115 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:02Z","lastTransitionTime":"2025-12-16T14:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.832121 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.832501 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.832512 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.832530 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.832540 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:02Z","lastTransitionTime":"2025-12-16T14:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.851327 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovnkube-controller/3.log" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.852301 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovnkube-controller/2.log" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.856027 4775 generic.go:334] "Generic (PLEG): container finished" podID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerID="cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee" exitCode=1 Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.856106 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerDied","Data":"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee"} Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.856199 4775 scope.go:117] "RemoveContainer" containerID="2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.856863 4775 scope.go:117] "RemoveContainer" containerID="cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee" Dec 16 14:56:02 crc kubenswrapper[4775]: E1216 14:56:02.862456 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.892722 4775 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2f9b76c521297f1ddbe9f0ed1dd85ba4f9e994b694dbee73d465c21677501f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:35Z\\\",\\\"message\\\":\\\"r removal\\\\nI1216 14:55:34.786527 6531 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:55:34.786545 6531 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:55:34.786568 6531 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:55:34.786591 6531 handler.go:190] Sending 
*v1.EgressIP event handler 8 for removal\\\\nI1216 14:55:34.786611 6531 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:55:34.786620 6531 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1216 14:55:34.786627 6531 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1216 14:55:34.786633 6531 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 14:55:34.786646 6531 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1216 14:55:34.786646 6531 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:55:34.786663 6531 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 14:55:34.786977 6531 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:55:34.786992 6531 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:55:34.787015 6531 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:55:34.787044 6531 factory.go:656] Stopping watch factory\\\\nI1216 14:55:34.787061 6531 ovnkube.go:599] Stopped ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:56:02Z\\\",\\\"message\\\":\\\" 6942 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 14:56:02.268022 6942 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 14:56:02.268036 6942 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 14:56:02.268061 6942 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 14:56:02.268078 6942 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:56:02.268084 6942 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 
14:56:02.268091 6942 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:56:02.268116 6942 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:56:02.268125 6942 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:56:02.268143 6942 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:56:02.268151 6942 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:56:02.268168 6942 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:56:02.268175 6942 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:56:02.268188 6942 factory.go:656] Stopping watch factory\\\\nI1216 14:56:02.268207 6942 ovnkube.go:599] Stopped ovnkube\\\\nI1216 14:56:02.268222 6942 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 14:56:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"m
ountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.905816 4775 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:02 crc 
kubenswrapper[4775]: I1216 14:56:02.920403 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1
cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.934530 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.934578 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.934591 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.934608 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.934620 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:02Z","lastTransitionTime":"2025-12-16T14:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.936561 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.950108 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.968073 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b
3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:02 crc kubenswrapper[4775]: I1216 14:56:02.985533 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df66b9c818cf970df880bf19cf5d511f23a4ff7bebd59e241339dd26e0ac8fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:45Z\\\",\\\"message\\\":\\\"2025-12-16T14:54:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4c19a038-fc0a-4c89-bccc-fa72b8607d01\\\\n2025-12-16T14:54:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4c19a038-fc0a-4c89-bccc-fa72b8607d01 to /host/opt/cni/bin/\\\\n2025-12-16T14:54:59Z [verbose] multus-daemon started\\\\n2025-12-16T14:54:59Z [verbose] 
Readiness Indicator file check\\\\n2025-12-16T14:55:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.003825 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.020674 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c602406f-1aab-45b5-b815-41c4f89fa869\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131fe40257ce003285c74c2cc7160316851ec72690dd09901ec8b16468e0d107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdad84c13b928859836825f69d08d47815805b625941bb708e4057dfe754d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17fa2414d74d950bfd3e9631cdf0da6bc8b58f406d485d086d084d305ad5d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.034784 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.037173 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.037202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.037212 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.037247 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.037260 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:03Z","lastTransitionTime":"2025-12-16T14:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.047190 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.060493 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.074231 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.089626 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.105247 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d6e6e8b-5b01-4fea-af89-216b58eb98f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5303ea6c3b5cbada36b01c138cf3db28f57fa8d5974b2e35179aef3ee62e4ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.134457 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.139219 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.139382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.139492 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.139594 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.139699 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:03Z","lastTransitionTime":"2025-12-16T14:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.152728 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.168403 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.182756 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.243045 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 
14:56:03.243103 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.243122 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.243146 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.243163 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:03Z","lastTransitionTime":"2025-12-16T14:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.338242 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:03 crc kubenswrapper[4775]: E1216 14:56:03.338389 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.345329 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.345366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.345380 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.345396 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.345406 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:03Z","lastTransitionTime":"2025-12-16T14:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.448136 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.448536 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.448612 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.448681 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.448762 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:03Z","lastTransitionTime":"2025-12-16T14:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.551678 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.552077 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.552225 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.552413 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.552550 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:03Z","lastTransitionTime":"2025-12-16T14:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.655363 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.655702 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.655791 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.655915 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.656029 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:03Z","lastTransitionTime":"2025-12-16T14:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.758686 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.759035 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.759151 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.759245 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.759316 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:03Z","lastTransitionTime":"2025-12-16T14:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.860985 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovnkube-controller/3.log" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.861065 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.861087 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.861097 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.861112 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.861121 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:03Z","lastTransitionTime":"2025-12-16T14:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.864827 4775 scope.go:117] "RemoveContainer" containerID="cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee" Dec 16 14:56:03 crc kubenswrapper[4775]: E1216 14:56:03.864981 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.879442 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.893272 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f8
3a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.904773 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.918427 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.929791 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d6e6e8b-5b01-4fea-af89-216b58eb98f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5303ea6c3b5cbada36b01c138cf3db28f57fa8d5974b2e35179aef3ee62e4ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.954628 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.968826 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.968898 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.968913 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.968931 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.968944 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:03Z","lastTransitionTime":"2025-12-16T14:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.977355 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:03 crc kubenswrapper[4775]: I1216 14:56:03.997455 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f63
07b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.015204 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df66b9c818cf970df880bf19cf5d511f23a4ff7bebd59e241339dd26e0ac8fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-12-16T14:55:45Z\\\",\\\"message\\\":\\\"2025-12-16T14:54:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4c19a038-fc0a-4c89-bccc-fa72b8607d01\\\\n2025-12-16T14:54:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4c19a038-fc0a-4c89-bccc-fa72b8607d01 to /host/opt/cni/bin/\\\\n2025-12-16T14:54:59Z [verbose] multus-daemon started\\\\n2025-12-16T14:54:59Z [verbose] Readiness Indicator file check\\\\n2025-12-16T14:55:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.037397 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:56:02Z\\\",\\\"message\\\":\\\" 6942 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 14:56:02.268022 6942 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 14:56:02.268036 6942 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 14:56:02.268061 6942 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1216 14:56:02.268078 6942 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:56:02.268084 6942 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:56:02.268091 6942 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:56:02.268116 6942 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:56:02.268125 6942 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:56:02.268143 6942 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:56:02.268151 6942 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:56:02.268168 6942 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:56:02.268175 6942 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:56:02.268188 6942 factory.go:656] Stopping watch factory\\\\nI1216 14:56:02.268207 6942 ovnkube.go:599] Stopped ovnkube\\\\nI1216 14:56:02.268222 6942 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 14:56:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:56:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f9018
4913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.050715 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.065414 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 
14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb77
0f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.071504 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.071560 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.071574 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.071597 4775 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.071615 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:04Z","lastTransitionTime":"2025-12-16T14:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.081929 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.096223 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.111700 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.126866 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.143188 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.160050 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c602406f-1aab-45b5-b815-41c4f89fa869\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131fe40257ce003285c74c2cc7160316851ec72690dd09901ec8b16468e0d107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdad84c13b928859836825f69d08d47815805b625941bb708e4057dfe754d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17fa2414d74d950bfd3e9631cdf0da6bc8b58f406d485d086d084d305ad5d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.174606 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.174656 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.174670 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.174691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.174704 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:04Z","lastTransitionTime":"2025-12-16T14:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.178688 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:04Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.277577 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.277638 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.277651 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.277673 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.277686 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:04Z","lastTransitionTime":"2025-12-16T14:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.337444 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.337507 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:04 crc kubenswrapper[4775]: E1216 14:56:04.338036 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.337596 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:04 crc kubenswrapper[4775]: E1216 14:56:04.338354 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:04 crc kubenswrapper[4775]: E1216 14:56:04.338111 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.381214 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.381272 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.381289 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.381316 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.381334 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:04Z","lastTransitionTime":"2025-12-16T14:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.484463 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.484526 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.484539 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.484562 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.484576 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:04Z","lastTransitionTime":"2025-12-16T14:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.587044 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.587098 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.587111 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.587131 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.587145 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:04Z","lastTransitionTime":"2025-12-16T14:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.689233 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.689289 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.689305 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.689329 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.689346 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:04Z","lastTransitionTime":"2025-12-16T14:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.792541 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.792861 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.793011 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.793111 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.793196 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:04Z","lastTransitionTime":"2025-12-16T14:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.896942 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.897036 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.897054 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.897108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:04 crc kubenswrapper[4775]: I1216 14:56:04.897130 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:04Z","lastTransitionTime":"2025-12-16T14:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.001136 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.001240 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.001263 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.001456 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.001495 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:05Z","lastTransitionTime":"2025-12-16T14:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.104773 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.105101 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.105213 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.105313 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.105413 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:05Z","lastTransitionTime":"2025-12-16T14:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.209372 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.209453 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.209470 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.209496 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.209554 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:05Z","lastTransitionTime":"2025-12-16T14:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.312999 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.313046 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.313062 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.313087 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.313103 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:05Z","lastTransitionTime":"2025-12-16T14:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.337050 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:05 crc kubenswrapper[4775]: E1216 14:56:05.337206 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.360737 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os
-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.378216 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df66b9c818cf970df880bf19cf5d511f23a4ff7bebd59e241339dd26e0ac8fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b6
7b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:45Z\\\",\\\"message\\\":\\\"2025-12-16T14:54:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4c19a038-fc0a-4c89-bccc-fa72b8607d01\\\\n2025-12-16T14:54:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4c19a038-fc0a-4c89-bccc-fa72b8607d01 to /host/opt/cni/bin/\\\\n2025-12-16T14:54:59Z [verbose] multus-daemon started\\\\n2025-12-16T14:54:59Z [verbose] Readiness Indicator file check\\\\n2025-12-16T14:55:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.409578 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:56:02Z\\\",\\\"message\\\":\\\" 6942 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 14:56:02.268022 6942 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 14:56:02.268036 6942 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 14:56:02.268061 6942 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1216 14:56:02.268078 6942 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:56:02.268084 6942 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:56:02.268091 6942 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:56:02.268116 6942 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:56:02.268125 6942 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:56:02.268143 6942 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:56:02.268151 6942 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:56:02.268168 6942 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:56:02.268175 6942 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:56:02.268188 6942 factory.go:656] Stopping watch factory\\\\nI1216 14:56:02.268207 6942 ovnkube.go:599] Stopped ovnkube\\\\nI1216 14:56:02.268222 6942 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 14:56:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:56:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f9018
4913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.415915 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.416113 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.416209 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.416298 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.416385 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:05Z","lastTransitionTime":"2025-12-16T14:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.425335 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc 
kubenswrapper[4775]: I1216 14:56:05.440254 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1
cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.460354 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.474765 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.491548 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.506992 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.519297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.519759 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.519871 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.520025 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.520135 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:05Z","lastTransitionTime":"2025-12-16T14:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.524987 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.539562 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c602406f-1aab-45b5-b815-41c4f89fa869\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131fe40257ce003285c74c2cc7160316851ec72690dd09901ec8b16468e0d107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdad84c13b928859836825f69d08d47815805b625941bb708e4057dfe754d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17fa2414d74d950bfd3e9631cdf0da6bc8b58f406d485d086d084d305ad5d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.554410 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.568747 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.582628 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.592407 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.602321 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08
a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.611369 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d6e6e8b-5b01-4fea-af89-216b58eb98f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5303ea6c3b5cbada36b01c138cf3db28f57fa8d5974b2e35179aef3ee62e4ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.623049 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.623078 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.623085 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.623119 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.623131 4775 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:05Z","lastTransitionTime":"2025-12-16T14:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.627332 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b
3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.638935 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:05Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.726241 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 
14:56:05.726295 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.726313 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.726338 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.726357 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:05Z","lastTransitionTime":"2025-12-16T14:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.828802 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.829256 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.829388 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.829493 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.829601 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:05Z","lastTransitionTime":"2025-12-16T14:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.931965 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.932032 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.932055 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.932090 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:05 crc kubenswrapper[4775]: I1216 14:56:05.932114 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:05Z","lastTransitionTime":"2025-12-16T14:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.034842 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.034946 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.034971 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.035003 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.035024 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:06Z","lastTransitionTime":"2025-12-16T14:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.138083 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.138145 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.138156 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.138172 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.138184 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:06Z","lastTransitionTime":"2025-12-16T14:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.241275 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.241337 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.241357 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.241381 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.241400 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:06Z","lastTransitionTime":"2025-12-16T14:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.337343 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.337390 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.337503 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:06 crc kubenswrapper[4775]: E1216 14:56:06.337627 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:06 crc kubenswrapper[4775]: E1216 14:56:06.337788 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:06 crc kubenswrapper[4775]: E1216 14:56:06.337869 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.349202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.349252 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.349264 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.349280 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.349337 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:06Z","lastTransitionTime":"2025-12-16T14:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.451784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.451829 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.451840 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.451855 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.451866 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:06Z","lastTransitionTime":"2025-12-16T14:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.554575 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.554771 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.554812 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.554844 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.554870 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:06Z","lastTransitionTime":"2025-12-16T14:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.658703 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.658826 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.658990 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.659034 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.659058 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:06Z","lastTransitionTime":"2025-12-16T14:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.762252 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.762661 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.762810 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.762989 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.763154 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:06Z","lastTransitionTime":"2025-12-16T14:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.867051 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.867420 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.867617 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.867835 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.868086 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:06Z","lastTransitionTime":"2025-12-16T14:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.970523 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.970859 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.970955 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.971035 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:06 crc kubenswrapper[4775]: I1216 14:56:06.971100 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:06Z","lastTransitionTime":"2025-12-16T14:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.074143 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.074199 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.074214 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.074235 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.074250 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:07Z","lastTransitionTime":"2025-12-16T14:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.177480 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.177560 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.177579 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.177602 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.177616 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:07Z","lastTransitionTime":"2025-12-16T14:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.280844 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.280922 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.280934 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.280949 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.280958 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:07Z","lastTransitionTime":"2025-12-16T14:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.337851 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:07 crc kubenswrapper[4775]: E1216 14:56:07.338100 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.383478 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.383543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.383560 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.383586 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.383603 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:07Z","lastTransitionTime":"2025-12-16T14:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.486234 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.486521 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.486603 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.486704 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.486792 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:07Z","lastTransitionTime":"2025-12-16T14:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.589443 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.589506 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.589524 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.589547 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.589566 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:07Z","lastTransitionTime":"2025-12-16T14:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.692748 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.692788 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.692799 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.692814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.692824 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:07Z","lastTransitionTime":"2025-12-16T14:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.794669 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.794968 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.795115 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.795215 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.795303 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:07Z","lastTransitionTime":"2025-12-16T14:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.898467 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.898777 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.898955 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.899079 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:07 crc kubenswrapper[4775]: I1216 14:56:07.899176 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:07Z","lastTransitionTime":"2025-12-16T14:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.002642 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.002683 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.002696 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.002713 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.002725 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:08Z","lastTransitionTime":"2025-12-16T14:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.105380 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.105414 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.105423 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.105438 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.105451 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:08Z","lastTransitionTime":"2025-12-16T14:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.208861 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.208915 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.208925 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.208939 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.208948 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:08Z","lastTransitionTime":"2025-12-16T14:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.312040 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.312095 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.312112 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.312136 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.312153 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:08Z","lastTransitionTime":"2025-12-16T14:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.337634 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.337725 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.337722 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:08 crc kubenswrapper[4775]: E1216 14:56:08.337835 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:08 crc kubenswrapper[4775]: E1216 14:56:08.338036 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:08 crc kubenswrapper[4775]: E1216 14:56:08.338072 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.414305 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.414337 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.414347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.414366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.414380 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:08Z","lastTransitionTime":"2025-12-16T14:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.516352 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.516395 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.516407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.516425 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.516438 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:08Z","lastTransitionTime":"2025-12-16T14:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.619117 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.619157 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.619169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.619186 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.619196 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:08Z","lastTransitionTime":"2025-12-16T14:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.722163 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.722233 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.722255 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.722285 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.722306 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:08Z","lastTransitionTime":"2025-12-16T14:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.813461 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.813507 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.813518 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.813532 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.813547 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:08Z","lastTransitionTime":"2025-12-16T14:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:08 crc kubenswrapper[4775]: E1216 14:56:08.834780 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.839520 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.839631 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.839655 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.839683 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.839704 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:08Z","lastTransitionTime":"2025-12-16T14:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:08 crc kubenswrapper[4775]: E1216 14:56:08.855555 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.861043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.861380 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.861573 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.861770 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.862034 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:08Z","lastTransitionTime":"2025-12-16T14:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:08 crc kubenswrapper[4775]: E1216 14:56:08.883238 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.887173 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.887218 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.887229 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.887245 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.887256 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:08Z","lastTransitionTime":"2025-12-16T14:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:08 crc kubenswrapper[4775]: E1216 14:56:08.902095 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.905557 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.905585 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.905598 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.905614 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.905625 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:08Z","lastTransitionTime":"2025-12-16T14:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:08 crc kubenswrapper[4775]: E1216 14:56:08.916545 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:08 crc kubenswrapper[4775]: E1216 14:56:08.916671 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.918057 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.918088 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.918179 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.918197 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:08 crc kubenswrapper[4775]: I1216 14:56:08.918208 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:08Z","lastTransitionTime":"2025-12-16T14:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.020391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.020450 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.020461 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.020476 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.020486 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:09Z","lastTransitionTime":"2025-12-16T14:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.123287 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.123334 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.123345 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.123365 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.123376 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:09Z","lastTransitionTime":"2025-12-16T14:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.226436 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.226505 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.226530 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.226559 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.226579 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:09Z","lastTransitionTime":"2025-12-16T14:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.329793 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.329878 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.329927 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.329954 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.329972 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:09Z","lastTransitionTime":"2025-12-16T14:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.337399 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:09 crc kubenswrapper[4775]: E1216 14:56:09.337629 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.432106 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.432154 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.432168 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.432186 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.432199 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:09Z","lastTransitionTime":"2025-12-16T14:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.534402 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.534435 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.534446 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.534459 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.534470 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:09Z","lastTransitionTime":"2025-12-16T14:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.637380 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.637425 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.637437 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.637455 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.637469 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:09Z","lastTransitionTime":"2025-12-16T14:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.740154 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.740199 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.740211 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.740227 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.740238 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:09Z","lastTransitionTime":"2025-12-16T14:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.842772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.842842 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.842864 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.842924 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.842950 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:09Z","lastTransitionTime":"2025-12-16T14:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.945001 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.945039 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.945048 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.945061 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:09 crc kubenswrapper[4775]: I1216 14:56:09.945069 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:09Z","lastTransitionTime":"2025-12-16T14:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.048425 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.048498 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.048517 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.048545 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.048567 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:10Z","lastTransitionTime":"2025-12-16T14:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.151392 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.151457 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.151477 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.151502 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.151525 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:10Z","lastTransitionTime":"2025-12-16T14:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.255031 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.255100 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.255115 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.255144 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.255163 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:10Z","lastTransitionTime":"2025-12-16T14:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.337409 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:10 crc kubenswrapper[4775]: E1216 14:56:10.337638 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.337777 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:10 crc kubenswrapper[4775]: E1216 14:56:10.338066 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.338082 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:10 crc kubenswrapper[4775]: E1216 14:56:10.338352 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.358018 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.358095 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.358118 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.358148 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.358168 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:10Z","lastTransitionTime":"2025-12-16T14:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.461484 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.461538 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.461551 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.461574 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.461589 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:10Z","lastTransitionTime":"2025-12-16T14:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.563587 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.563658 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.563676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.563701 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.563721 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:10Z","lastTransitionTime":"2025-12-16T14:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.666361 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.666417 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.666433 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.666454 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.666469 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:10Z","lastTransitionTime":"2025-12-16T14:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.769756 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.770078 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.770110 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.770130 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.770141 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:10Z","lastTransitionTime":"2025-12-16T14:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.873559 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.873990 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.874013 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.874042 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.874064 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:10Z","lastTransitionTime":"2025-12-16T14:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.976387 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.976419 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.976431 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.976447 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:10 crc kubenswrapper[4775]: I1216 14:56:10.976457 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:10Z","lastTransitionTime":"2025-12-16T14:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.079528 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.079837 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.079941 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.080222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.080296 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:11Z","lastTransitionTime":"2025-12-16T14:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.182773 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.182846 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.182864 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.182948 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.182982 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:11Z","lastTransitionTime":"2025-12-16T14:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.285588 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.285954 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.286051 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.286157 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.286265 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:11Z","lastTransitionTime":"2025-12-16T14:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.337451 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:11 crc kubenswrapper[4775]: E1216 14:56:11.337659 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.388593 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.388788 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.388853 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.388933 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.389007 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:11Z","lastTransitionTime":"2025-12-16T14:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.491421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.491532 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.491601 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.491673 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.491746 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:11Z","lastTransitionTime":"2025-12-16T14:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.595399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.595768 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.595870 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.596002 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.596087 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:11Z","lastTransitionTime":"2025-12-16T14:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.699446 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.700037 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.700510 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.700767 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.701031 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:11Z","lastTransitionTime":"2025-12-16T14:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.804522 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.804592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.804617 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.804648 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.804671 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:11Z","lastTransitionTime":"2025-12-16T14:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.907483 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.907534 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.907546 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.907583 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:11 crc kubenswrapper[4775]: I1216 14:56:11.907595 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:11Z","lastTransitionTime":"2025-12-16T14:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.010976 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.011067 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.011086 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.011116 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.011135 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:12Z","lastTransitionTime":"2025-12-16T14:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.114672 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.114725 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.114742 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.114765 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.114786 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:12Z","lastTransitionTime":"2025-12-16T14:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.218093 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.219057 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.219099 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.219123 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.219140 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:12Z","lastTransitionTime":"2025-12-16T14:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.323091 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.323162 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.323174 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.323193 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.323206 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:12Z","lastTransitionTime":"2025-12-16T14:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.337153 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.337173 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:12 crc kubenswrapper[4775]: E1216 14:56:12.337287 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.337158 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:12 crc kubenswrapper[4775]: E1216 14:56:12.337492 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:12 crc kubenswrapper[4775]: E1216 14:56:12.337557 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.426370 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.426419 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.426430 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.426448 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.426460 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:12Z","lastTransitionTime":"2025-12-16T14:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.529932 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.530000 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.530012 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.530028 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.530039 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:12Z","lastTransitionTime":"2025-12-16T14:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.632671 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.632715 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.632727 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.632747 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.632760 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:12Z","lastTransitionTime":"2025-12-16T14:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.735695 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.735736 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.735746 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.735763 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.735773 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:12Z","lastTransitionTime":"2025-12-16T14:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.838528 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.838863 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.839041 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.839101 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.839117 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:12Z","lastTransitionTime":"2025-12-16T14:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.941873 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.941963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.941986 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.942010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:12 crc kubenswrapper[4775]: I1216 14:56:12.942026 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:12Z","lastTransitionTime":"2025-12-16T14:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.045339 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.045392 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.045410 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.045443 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.045462 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:13Z","lastTransitionTime":"2025-12-16T14:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.147997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.148036 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.148048 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.148065 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.148078 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:13Z","lastTransitionTime":"2025-12-16T14:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.251150 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.251774 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.251852 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.251969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.252053 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:13Z","lastTransitionTime":"2025-12-16T14:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.337191 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:13 crc kubenswrapper[4775]: E1216 14:56:13.337396 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.353823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.353853 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.353861 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.353872 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.353881 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:13Z","lastTransitionTime":"2025-12-16T14:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.455351 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.455396 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.455407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.455423 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.455433 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:13Z","lastTransitionTime":"2025-12-16T14:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.558165 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.558233 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.558254 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.558285 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.558305 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:13Z","lastTransitionTime":"2025-12-16T14:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.661082 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.661133 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.661150 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.661171 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.661187 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:13Z","lastTransitionTime":"2025-12-16T14:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.763554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.763608 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.763623 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.763644 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.763671 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:13Z","lastTransitionTime":"2025-12-16T14:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.866159 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.866214 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.866226 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.866246 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.866266 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:13Z","lastTransitionTime":"2025-12-16T14:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.968712 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.969121 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.969171 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.969203 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:13 crc kubenswrapper[4775]: I1216 14:56:13.969225 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:13Z","lastTransitionTime":"2025-12-16T14:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.072388 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.072439 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.072452 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.072468 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.072478 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:14Z","lastTransitionTime":"2025-12-16T14:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.174789 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.175157 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.175272 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.175399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.175495 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:14Z","lastTransitionTime":"2025-12-16T14:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.278297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.278347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.278358 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.278374 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.278385 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:14Z","lastTransitionTime":"2025-12-16T14:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.336837 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:14 crc kubenswrapper[4775]: E1216 14:56:14.337342 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.336972 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:14 crc kubenswrapper[4775]: E1216 14:56:14.337664 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.336925 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:14 crc kubenswrapper[4775]: E1216 14:56:14.337972 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.381028 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.381086 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.381106 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.381127 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.381142 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:14Z","lastTransitionTime":"2025-12-16T14:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.483497 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.483543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.483555 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.483569 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.483583 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:14Z","lastTransitionTime":"2025-12-16T14:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.586154 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.586429 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.586634 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.586851 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.587082 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:14Z","lastTransitionTime":"2025-12-16T14:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.689507 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.689556 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.689566 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.689584 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.689595 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:14Z","lastTransitionTime":"2025-12-16T14:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.792551 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.792595 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.792605 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.792620 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:14 crc kubenswrapper[4775]: I1216 14:56:14.792631 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:14Z","lastTransitionTime":"2025-12-16T14:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.342764 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:15 crc kubenswrapper[4775]: E1216 14:56:15.342963 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.357555 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-47t7r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef8da9e-565b-40c0-a37d-f4f44c552912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9f852bb319faeca4d44ab9292fe9077d213bab3e1fa46bf902baad2e4d0ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs
.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-47t7r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.371730 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06c229d1-beab-4662-96c5-e458d6cd3e83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9f6a43c2d1c35bf67aeb8f3deafe61221bdfee86e8a73b04f7b96c7daa700e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a6e5f62229033d4f67f9b49a2b4f15a7b08
a20e401d6d665b585eab5adc45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9cs7c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jv9gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.384749 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d6e6e8b-5b01-4fea-af89-216b58eb98f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5303ea6c3b5cbada36b01c138cf3db28f57fa8d5974b2e35179aef3ee62e4ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2b89224fb36239369b3a91b2a74b744326cfb97f7295a7b5d9319461981318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.417832 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c730de93-b5c2-44ab-a257-dd4b051b6491\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff894e260184af58c3de087bf7d0da679d50e486b07a662e285eaf6d3f7a9f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e23cc06d41d06d661fd3b3f1d5cff5a12ed7e1dec078410a557eb46e8058347f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c77e8874286ba151b48d773306a8531abaa8cbe97de7b9fb5d87ab243c5f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebe46864c55164962a79a60e104db6647611e3c6f90abe1fa33a33583a34046f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047d8e22f49db1229a1609ee4ed8da00e3508293fb3d29eb3a4951d0248182dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecab44dd145cadc6592fb52804d6a5513ac72b3b0663de6745679e8ec5f400b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d8dfb991673791f072c053c126520e14ff3e6843f3808952232d601362bb25f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a68d9362cc096b91f1ab9f551bfed65251d2d3a1196189cae614dd4eed7ff6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.431530 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.443247 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82abda4fdab9ba7ee8758446af2dd5daf0815971d54cb0af0f82c7e836f8bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.453309 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584613dc-ef95-4911-9a79-76e805e1d4d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f18e7cd35c741178daa3b690bb777cda6f7399868926a860f7005c1e6f8c26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x482d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lh6xh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.474722 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"524488dd-74ee-43ea-ac0f-5e04d59af434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:56:02Z\\\",\\\"message\\\":\\\" 6942 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 14:56:02.268022 6942 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 14:56:02.268036 6942 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 14:56:02.268061 6942 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1216 14:56:02.268078 6942 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:56:02.268084 6942 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:56:02.268091 6942 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:56:02.268116 6942 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:56:02.268125 6942 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:56:02.268143 6942 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:56:02.268151 6942 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:56:02.268168 6942 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:56:02.268175 6942 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:56:02.268188 6942 factory.go:656] Stopping watch factory\\\\nI1216 14:56:02.268207 6942 ovnkube.go:599] Stopped ovnkube\\\\nI1216 14:56:02.268222 6942 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 14:56:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:56:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6bbb3366407f9018
4913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gcjwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-79w7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.485409 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d592ae8-792f-4cc5-9a32-b278deb33810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrwzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c6mdt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.498990 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:54:53.669426 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:54:53.669596 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:54:53.670709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2759783901/tls.crt::/tmp/serving-cert-2759783901/tls.key\\\\\\\"\\\\nI1216 14:54:54.081992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:54:54.084385 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:54:54.084405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:54:54.084426 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:54:54.084447 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:54:54.089442 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 14:54:54.090987 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:54:54.091428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 
14:54:54.091458 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:54:54.091468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:54:54.091474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:54:54.091480 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:54:54.091486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 14:54:54.091675 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb77
0f4565c41ca470a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.512939 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.523345 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f2p7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7cfa7b1-7467-4f2f-b0aa-c12e5b0e92b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ae4307e78a3254ad6290444a061dc21b09a35e31aa074aed3b185795954184\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6tkgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:57Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f2p7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.536905 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hftd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f516c5-1af7-40c9-b8e2-2ce5386dce33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c25a08f017b7394065eb2294e5ecc2cbb8bfac83ee5be5e641b63dc7e00d899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cda4b54c41b5e66a59fab963767e4da5b2e94df42efb216c328517108c89500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989c54b48633c6f2304cee6c3f9050500ec99a83cd444ed59d4aa8bbdb1fcd1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60349a395d9777bc2cc57680792f46c47f34e73151f8a3545d46d1638fa0d724\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7f6307b82ac659476f06f8cef09c29abe6a6c8dc17b55c5421fdacfeacb02f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b
3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12aa4cd282a3ca676c65325e5c81147b3e5008cf0bbf3cc8bed67e52a381d159\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://916cdb2a0d2d386e21781d85fff82f2e4ccb73a76fd212c412dea8d64535bcb1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-16T14:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxz7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hftd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.553016 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mc2lg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f108f76f-c79a-42b0-b5ac-714d49d9a4d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df66b9c818cf970df880bf19cf5d511f23a4ff7bebd59e241339dd26e0ac8fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:55:45Z\\\",\\\"message\\\":\\\"2025-12-16T14:54:59+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4c19a038-fc0a-4c89-bccc-fa72b8607d01\\\\n2025-12-16T14:54:59+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4c19a038-fc0a-4c89-bccc-fa72b8607d01 to /host/opt/cni/bin/\\\\n2025-12-16T14:54:59Z [verbose] multus-daemon started\\\\n2025-12-16T14:54:59Z [verbose] 
Readiness Indicator file check\\\\n2025-12-16T14:55:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j57ld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mc2lg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.566434 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38131824-0aa5-4809-b875-225963f805a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eebd8fa6905571c09884c167785ff4f5f5febff7e1b04a01a63496a2f080ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4802629283d2ae2100ed26bddc87aa0363e92d6d616b69c6dda25c241a553a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5376695feac0d9910c197a72cc773ac06211667601654477f1f9d1c043ed1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.577989 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c602406f-1aab-45b5-b815-41c4f89fa869\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131fe40257ce003285c74c2cc7160316851ec72690dd09901ec8b16468e0d107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbdad84c13b928859836825f69d08d47815805b625941bb708e4057dfe754d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17fa2414d74d950bfd3e9631cdf0da6bc8b58f406d485d086d084d305ad5d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c06f77f734f43f54162577251dbbd7fc19a8ebcc64cd44b2c49a1520461f255\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:54:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:54:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.591766 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a763d95d0035d2b853c1030391549cd409b5f017e18c294b4834d1c7975cf64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.603920 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.619672 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:54:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5973cbe7afd565f3cb5b2f8b4eb719374323f50fb01d5a905970aa1843b961d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e182d9ea20a03562a74841a01b0f3c870f72e00e0c5d203b7ab02bb67673690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:54:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.726166 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.726200 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.726214 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.726237 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.726250 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:15Z","lastTransitionTime":"2025-12-16T14:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.829004 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.829068 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.829085 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.829108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.829126 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:15Z","lastTransitionTime":"2025-12-16T14:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.932011 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.932075 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.932100 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.932132 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:15 crc kubenswrapper[4775]: I1216 14:56:15.932155 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:15Z","lastTransitionTime":"2025-12-16T14:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.036168 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.036210 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.036221 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.036236 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.036246 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:16Z","lastTransitionTime":"2025-12-16T14:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.139450 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.139523 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.139542 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.139567 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.139586 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:16Z","lastTransitionTime":"2025-12-16T14:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.242875 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.242974 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.242992 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.243018 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.243038 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:16Z","lastTransitionTime":"2025-12-16T14:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.337598 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.337769 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.337838 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:16 crc kubenswrapper[4775]: E1216 14:56:16.337941 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:16 crc kubenswrapper[4775]: E1216 14:56:16.337930 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:16 crc kubenswrapper[4775]: E1216 14:56:16.338098 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.345330 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.345379 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.345395 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.345416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.345432 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:16Z","lastTransitionTime":"2025-12-16T14:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.449161 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.449254 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.449281 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.449310 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.449328 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:16Z","lastTransitionTime":"2025-12-16T14:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.473586 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs\") pod \"network-metrics-daemon-c6mdt\" (UID: \"3d592ae8-792f-4cc5-9a32-b278deb33810\") " pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:16 crc kubenswrapper[4775]: E1216 14:56:16.473785 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:56:16 crc kubenswrapper[4775]: E1216 14:56:16.473950 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs podName:3d592ae8-792f-4cc5-9a32-b278deb33810 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.47386946 +0000 UTC m=+165.424948423 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs") pod "network-metrics-daemon-c6mdt" (UID: "3d592ae8-792f-4cc5-9a32-b278deb33810") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.553038 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.553106 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.553145 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.553182 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.553207 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:16Z","lastTransitionTime":"2025-12-16T14:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.655472 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.655511 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.655522 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.655538 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.655550 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:16Z","lastTransitionTime":"2025-12-16T14:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.758953 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.759245 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.759372 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.759474 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.759550 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:16Z","lastTransitionTime":"2025-12-16T14:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.862999 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.863043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.863055 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.863075 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.863088 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:16Z","lastTransitionTime":"2025-12-16T14:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.966301 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.966370 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.966388 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.966415 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:16 crc kubenswrapper[4775]: I1216 14:56:16.966438 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:16Z","lastTransitionTime":"2025-12-16T14:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.068802 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.069178 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.069322 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.069454 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.069595 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:17Z","lastTransitionTime":"2025-12-16T14:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.173231 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.173292 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.173305 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.173327 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.173344 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:17Z","lastTransitionTime":"2025-12-16T14:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.276899 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.276946 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.276959 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.276979 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.276993 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:17Z","lastTransitionTime":"2025-12-16T14:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.337207 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:17 crc kubenswrapper[4775]: E1216 14:56:17.337432 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.379618 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.379679 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.379698 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.379721 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.379741 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:17Z","lastTransitionTime":"2025-12-16T14:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.483300 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.483680 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.483767 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.483937 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.484108 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:17Z","lastTransitionTime":"2025-12-16T14:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.587120 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.587171 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.587184 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.587202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.587218 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:17Z","lastTransitionTime":"2025-12-16T14:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.689756 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.689857 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.689877 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.689932 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.689952 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:17Z","lastTransitionTime":"2025-12-16T14:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.792592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.792655 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.792666 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.792684 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.792697 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:17Z","lastTransitionTime":"2025-12-16T14:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.896076 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.896159 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.896173 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.896198 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.896220 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:17Z","lastTransitionTime":"2025-12-16T14:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.998935 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.998972 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.998981 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.998997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:17 crc kubenswrapper[4775]: I1216 14:56:17.999008 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:17Z","lastTransitionTime":"2025-12-16T14:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.101370 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.101412 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.101424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.101442 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.101454 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:18Z","lastTransitionTime":"2025-12-16T14:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.203853 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.203932 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.203945 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.203962 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.203975 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:18Z","lastTransitionTime":"2025-12-16T14:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.306939 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.306983 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.306995 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.307010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.307022 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:18Z","lastTransitionTime":"2025-12-16T14:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.337631 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.337702 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.337817 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:18 crc kubenswrapper[4775]: E1216 14:56:18.338033 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:18 crc kubenswrapper[4775]: E1216 14:56:18.338604 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:18 crc kubenswrapper[4775]: E1216 14:56:18.338749 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.339085 4775 scope.go:117] "RemoveContainer" containerID="cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee" Dec 16 14:56:18 crc kubenswrapper[4775]: E1216 14:56:18.339365 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.410397 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.410442 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.410452 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.410467 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.410480 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:18Z","lastTransitionTime":"2025-12-16T14:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.512791 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.512833 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.512845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.512862 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.512876 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:18Z","lastTransitionTime":"2025-12-16T14:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.615944 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.615985 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.615998 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.616021 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.616036 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:18Z","lastTransitionTime":"2025-12-16T14:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.718548 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.718630 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.718650 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.718687 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.718723 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:18Z","lastTransitionTime":"2025-12-16T14:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.822030 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.822089 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.822099 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.822119 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.822133 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:18Z","lastTransitionTime":"2025-12-16T14:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.923799 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.923836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.923846 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.923862 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:18 crc kubenswrapper[4775]: I1216 14:56:18.923873 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:18Z","lastTransitionTime":"2025-12-16T14:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.022126 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.022176 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.022189 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.022207 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.022221 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:19Z","lastTransitionTime":"2025-12-16T14:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:19 crc kubenswrapper[4775]: E1216 14:56:19.042717 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.048400 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.048448 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.048463 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.048482 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.048495 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:19Z","lastTransitionTime":"2025-12-16T14:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:19 crc kubenswrapper[4775]: E1216 14:56:19.064138 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.068318 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.068346 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.068357 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.068372 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.068383 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:19Z","lastTransitionTime":"2025-12-16T14:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:19 crc kubenswrapper[4775]: E1216 14:56:19.086287 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.090428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.090463 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.090474 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.090490 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.090502 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:19Z","lastTransitionTime":"2025-12-16T14:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:19 crc kubenswrapper[4775]: E1216 14:56:19.106277 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.110320 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.110382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.110404 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.110430 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.110448 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:19Z","lastTransitionTime":"2025-12-16T14:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:19 crc kubenswrapper[4775]: E1216 14:56:19.126541 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4dbd1130-4ad9-49a4-81ac-e33bda81b192\\\",\\\"systemUUID\\\":\\\"1c1c08a3-d604-4a9e-b8da-c0df5af4d40b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:56:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:56:19 crc kubenswrapper[4775]: E1216 14:56:19.126789 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.128422 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.128478 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.128497 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.128521 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.128540 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:19Z","lastTransitionTime":"2025-12-16T14:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.231769 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.231823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.231835 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.231852 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.231866 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:19Z","lastTransitionTime":"2025-12-16T14:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.333968 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.334012 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.334026 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.334048 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.334062 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:19Z","lastTransitionTime":"2025-12-16T14:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.338457 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:19 crc kubenswrapper[4775]: E1216 14:56:19.339374 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.437074 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.437125 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.437137 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.437154 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.437166 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:19Z","lastTransitionTime":"2025-12-16T14:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.540445 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.540493 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.540509 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.540536 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.540554 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:19Z","lastTransitionTime":"2025-12-16T14:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.643540 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.643612 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.643635 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.643663 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.643686 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:19Z","lastTransitionTime":"2025-12-16T14:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.746761 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.746805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.746816 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.746835 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.746849 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:19Z","lastTransitionTime":"2025-12-16T14:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.848703 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.848759 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.848772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.848790 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.848804 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:19Z","lastTransitionTime":"2025-12-16T14:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.951704 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.951746 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.951764 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.951785 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:19 crc kubenswrapper[4775]: I1216 14:56:19.951798 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:19Z","lastTransitionTime":"2025-12-16T14:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.055237 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.055356 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.055374 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.055399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.055417 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:20Z","lastTransitionTime":"2025-12-16T14:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.158347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.158388 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.158399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.158413 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.158423 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:20Z","lastTransitionTime":"2025-12-16T14:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.261648 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.261685 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.261693 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.261707 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.261717 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:20Z","lastTransitionTime":"2025-12-16T14:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.337195 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.337244 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.337379 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:20 crc kubenswrapper[4775]: E1216 14:56:20.337512 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:20 crc kubenswrapper[4775]: E1216 14:56:20.337657 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:20 crc kubenswrapper[4775]: E1216 14:56:20.337781 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.365117 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.365176 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.365188 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.365209 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.365222 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:20Z","lastTransitionTime":"2025-12-16T14:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.467636 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.467698 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.467714 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.467732 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.467748 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:20Z","lastTransitionTime":"2025-12-16T14:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.570523 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.570571 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.570585 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.570603 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.570614 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:20Z","lastTransitionTime":"2025-12-16T14:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.673205 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.673246 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.673259 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.673276 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.673289 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:20Z","lastTransitionTime":"2025-12-16T14:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.776332 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.776379 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.776392 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.776411 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.776423 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:20Z","lastTransitionTime":"2025-12-16T14:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.878828 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.878918 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.878939 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.878963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.878980 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:20Z","lastTransitionTime":"2025-12-16T14:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.982281 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.982339 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.982349 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.982368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:20 crc kubenswrapper[4775]: I1216 14:56:20.982380 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:20Z","lastTransitionTime":"2025-12-16T14:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.085571 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.085619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.085630 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.085649 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.085661 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:21Z","lastTransitionTime":"2025-12-16T14:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.188511 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.188581 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.188600 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.188623 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.188640 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:21Z","lastTransitionTime":"2025-12-16T14:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.292345 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.292393 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.292407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.292424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.292438 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:21Z","lastTransitionTime":"2025-12-16T14:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.337460 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:21 crc kubenswrapper[4775]: E1216 14:56:21.337703 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.395566 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.395715 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.395737 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.395761 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.395799 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:21Z","lastTransitionTime":"2025-12-16T14:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.498942 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.498994 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.499005 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.499020 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.499031 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:21Z","lastTransitionTime":"2025-12-16T14:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.601677 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.601730 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.601740 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.601756 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.601767 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:21Z","lastTransitionTime":"2025-12-16T14:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.704361 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.704397 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.704407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.704422 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.704431 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:21Z","lastTransitionTime":"2025-12-16T14:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.807271 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.807328 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.807339 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.807353 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.807365 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:21Z","lastTransitionTime":"2025-12-16T14:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.910055 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.910124 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.910142 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.910170 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:21 crc kubenswrapper[4775]: I1216 14:56:21.910190 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:21Z","lastTransitionTime":"2025-12-16T14:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.013378 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.013454 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.013473 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.013509 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.013547 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:22Z","lastTransitionTime":"2025-12-16T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.116794 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.116849 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.116867 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.116936 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.116962 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:22Z","lastTransitionTime":"2025-12-16T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.219472 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.219536 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.219550 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.219570 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.219583 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:22Z","lastTransitionTime":"2025-12-16T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.323044 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.323083 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.323094 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.323111 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.323122 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:22Z","lastTransitionTime":"2025-12-16T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.337227 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.337395 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.337426 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:22 crc kubenswrapper[4775]: E1216 14:56:22.337738 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:22 crc kubenswrapper[4775]: E1216 14:56:22.337871 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:22 crc kubenswrapper[4775]: E1216 14:56:22.338048 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.425694 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.426193 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.426331 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.426449 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.426598 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:22Z","lastTransitionTime":"2025-12-16T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.529385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.529442 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.529456 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.529476 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.529488 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:22Z","lastTransitionTime":"2025-12-16T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.632929 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.632984 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.632997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.633015 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.633029 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:22Z","lastTransitionTime":"2025-12-16T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.735425 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.735470 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.735483 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.735501 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.735516 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:22Z","lastTransitionTime":"2025-12-16T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.837393 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.837455 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.837554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.837581 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.837596 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:22Z","lastTransitionTime":"2025-12-16T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.939954 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.940022 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.940040 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.940072 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:22 crc kubenswrapper[4775]: I1216 14:56:22.940095 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:22Z","lastTransitionTime":"2025-12-16T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.043532 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.043622 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.043674 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.043700 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.043717 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:23Z","lastTransitionTime":"2025-12-16T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.146489 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.146535 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.146546 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.146563 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.146577 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:23Z","lastTransitionTime":"2025-12-16T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.249609 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.249664 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.249680 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.249701 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.249713 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:23Z","lastTransitionTime":"2025-12-16T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.337248 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:23 crc kubenswrapper[4775]: E1216 14:56:23.337407 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.352098 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.352146 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.352158 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.352176 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.352190 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:23Z","lastTransitionTime":"2025-12-16T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.454783 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.454828 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.454839 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.454856 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.454866 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:23Z","lastTransitionTime":"2025-12-16T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.557836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.557931 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.557951 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.557974 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.557994 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:23Z","lastTransitionTime":"2025-12-16T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.660660 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.660707 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.660720 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.660738 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.660751 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:23Z","lastTransitionTime":"2025-12-16T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.763708 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.763829 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.763841 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.763859 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.763872 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:23Z","lastTransitionTime":"2025-12-16T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.867052 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.867116 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.867127 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.867147 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.867162 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:23Z","lastTransitionTime":"2025-12-16T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.969970 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.970286 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.970312 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.970330 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:23 crc kubenswrapper[4775]: I1216 14:56:23.970366 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:23Z","lastTransitionTime":"2025-12-16T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.072806 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.072838 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.072848 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.072862 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.072872 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:24Z","lastTransitionTime":"2025-12-16T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.175411 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.175454 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.175468 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.175488 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.175502 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:24Z","lastTransitionTime":"2025-12-16T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.278190 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.278231 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.278243 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.278259 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.278283 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:24Z","lastTransitionTime":"2025-12-16T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.336978 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.337086 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:24 crc kubenswrapper[4775]: E1216 14:56:24.337127 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:24 crc kubenswrapper[4775]: E1216 14:56:24.337256 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.337588 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:24 crc kubenswrapper[4775]: E1216 14:56:24.337808 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.380639 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.380686 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.380701 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.380721 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.380735 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:24Z","lastTransitionTime":"2025-12-16T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.483738 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.483792 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.483808 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.483833 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.483855 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:24Z","lastTransitionTime":"2025-12-16T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.586983 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.588008 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.588053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.588084 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.588110 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:24Z","lastTransitionTime":"2025-12-16T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.692123 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.692181 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.692191 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.692215 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.692227 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:24Z","lastTransitionTime":"2025-12-16T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.796297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.796366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.796384 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.796409 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.796428 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:24Z","lastTransitionTime":"2025-12-16T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.899684 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.899734 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.899745 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.899767 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:24 crc kubenswrapper[4775]: I1216 14:56:24.899779 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:24Z","lastTransitionTime":"2025-12-16T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.002683 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.002802 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.002822 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.002845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.002862 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:25Z","lastTransitionTime":"2025-12-16T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.105971 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.106040 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.106066 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.106091 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.106109 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:25Z","lastTransitionTime":"2025-12-16T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.208435 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.208497 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.208516 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.208543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.208560 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:25Z","lastTransitionTime":"2025-12-16T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.312018 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.312093 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.312106 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.312130 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.312145 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:25Z","lastTransitionTime":"2025-12-16T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.337363 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:25 crc kubenswrapper[4775]: E1216 14:56:25.338142 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.361322 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-47t7r" podStartSLOduration=88.361305819 podStartE2EDuration="1m28.361305819s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:56:25.360964168 +0000 UTC m=+110.312043121" watchObservedRunningTime="2025-12-16 14:56:25.361305819 +0000 UTC m=+110.312384752" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.375755 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jv9gg" podStartSLOduration=87.375731191 podStartE2EDuration="1m27.375731191s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:56:25.375396862 +0000 UTC m=+110.326475795" watchObservedRunningTime="2025-12-16 14:56:25.375731191 +0000 UTC m=+110.326810114" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.420960 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.420992 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.421000 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.421015 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:25 crc 
kubenswrapper[4775]: I1216 14:56:25.421027 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:25Z","lastTransitionTime":"2025-12-16T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.431943 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=41.431923225 podStartE2EDuration="41.431923225s" podCreationTimestamp="2025-12-16 14:55:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:56:25.392078435 +0000 UTC m=+110.343157428" watchObservedRunningTime="2025-12-16 14:56:25.431923225 +0000 UTC m=+110.383002148" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.443952 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=87.443924602 podStartE2EDuration="1m27.443924602s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:56:25.4320647 +0000 UTC m=+110.383143693" watchObservedRunningTime="2025-12-16 14:56:25.443924602 +0000 UTC m=+110.395003535" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.470435 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podStartSLOduration=88.470402034 podStartE2EDuration="1m28.470402034s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:56:25.469861627 +0000 UTC m=+110.420940560" watchObservedRunningTime="2025-12-16 14:56:25.470402034 +0000 UTC m=+110.421481007" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.518321 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=91.518297538 podStartE2EDuration="1m31.518297538s" podCreationTimestamp="2025-12-16 14:54:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:56:25.517974697 +0000 UTC m=+110.469053620" watchObservedRunningTime="2025-12-16 14:56:25.518297538 +0000 UTC m=+110.469376461" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.523378 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.523629 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.523695 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.523756 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.523833 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:25Z","lastTransitionTime":"2025-12-16T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.560154 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-f2p7z" podStartSLOduration=88.560130351 podStartE2EDuration="1m28.560130351s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:56:25.543748086 +0000 UTC m=+110.494827009" watchObservedRunningTime="2025-12-16 14:56:25.560130351 +0000 UTC m=+110.511209274" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.575942 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mc2lg" podStartSLOduration=88.575922876 podStartE2EDuration="1m28.575922876s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:56:25.575903936 +0000 UTC m=+110.526982869" watchObservedRunningTime="2025-12-16 14:56:25.575922876 +0000 UTC m=+110.527001799" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.576647 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hftd7" podStartSLOduration=88.576640239 podStartE2EDuration="1m28.576640239s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:56:25.562505085 +0000 UTC m=+110.513584048" watchObservedRunningTime="2025-12-16 14:56:25.576640239 +0000 UTC m=+110.527719162" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.611004 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=86.610985867 
podStartE2EDuration="1m26.610985867s" podCreationTimestamp="2025-12-16 14:54:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:56:25.591854046 +0000 UTC m=+110.542932979" watchObservedRunningTime="2025-12-16 14:56:25.610985867 +0000 UTC m=+110.562064790" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.611124 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=57.611121201 podStartE2EDuration="57.611121201s" podCreationTimestamp="2025-12-16 14:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:56:25.610386959 +0000 UTC m=+110.561465882" watchObservedRunningTime="2025-12-16 14:56:25.611121201 +0000 UTC m=+110.562200124" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.626250 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.626302 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.626315 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.626334 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.626347 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:25Z","lastTransitionTime":"2025-12-16T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.728641 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.728977 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.729149 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.729274 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.729413 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:25Z","lastTransitionTime":"2025-12-16T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.832938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.833000 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.833022 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.833050 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.833072 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:25Z","lastTransitionTime":"2025-12-16T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.935365 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.935424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.935437 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.935456 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:25 crc kubenswrapper[4775]: I1216 14:56:25.935467 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:25Z","lastTransitionTime":"2025-12-16T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.037378 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.037415 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.037424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.037439 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.037453 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:26Z","lastTransitionTime":"2025-12-16T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.151446 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.151477 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.151488 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.151504 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.151513 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:26Z","lastTransitionTime":"2025-12-16T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.254935 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.254995 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.255011 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.255035 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.255051 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:26Z","lastTransitionTime":"2025-12-16T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.337172 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:26 crc kubenswrapper[4775]: E1216 14:56:26.337412 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.337502 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.337798 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:26 crc kubenswrapper[4775]: E1216 14:56:26.337922 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:26 crc kubenswrapper[4775]: E1216 14:56:26.338379 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.358763 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.358850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.358867 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.358952 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.358972 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:26Z","lastTransitionTime":"2025-12-16T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.462234 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.462300 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.462319 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.462345 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.462362 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:26Z","lastTransitionTime":"2025-12-16T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.566010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.566321 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.566387 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.566463 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.566531 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:26Z","lastTransitionTime":"2025-12-16T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.669254 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.669998 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.670113 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.670305 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.670437 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:26Z","lastTransitionTime":"2025-12-16T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.774772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.775479 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.775699 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.775940 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.776157 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:26Z","lastTransitionTime":"2025-12-16T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.879144 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.879208 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.879221 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.879243 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.879258 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:26Z","lastTransitionTime":"2025-12-16T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.985388 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.985468 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.985496 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.985527 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:26 crc kubenswrapper[4775]: I1216 14:56:26.985553 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:26Z","lastTransitionTime":"2025-12-16T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.088422 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.088486 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.088499 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.088518 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.088530 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:27Z","lastTransitionTime":"2025-12-16T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.190765 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.190822 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.190835 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.190858 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.190872 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:27Z","lastTransitionTime":"2025-12-16T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.294219 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.294286 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.294304 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.294328 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.294347 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:27Z","lastTransitionTime":"2025-12-16T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.337278 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:27 crc kubenswrapper[4775]: E1216 14:56:27.337483 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.401033 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.401221 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.401246 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.401282 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.401302 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:27Z","lastTransitionTime":"2025-12-16T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.504016 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.504101 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.504119 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.504143 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.504160 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:27Z","lastTransitionTime":"2025-12-16T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.607844 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.607975 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.608000 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.608032 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.608055 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:27Z","lastTransitionTime":"2025-12-16T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.710953 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.711003 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.711011 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.711027 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.711037 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:27Z","lastTransitionTime":"2025-12-16T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.814017 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.814392 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.814469 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.814558 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.814641 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:27Z","lastTransitionTime":"2025-12-16T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.917156 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.917508 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.917790 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.917877 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:27 crc kubenswrapper[4775]: I1216 14:56:27.918006 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:27Z","lastTransitionTime":"2025-12-16T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.021413 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.021481 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.021500 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.021526 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.021543 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:28Z","lastTransitionTime":"2025-12-16T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.124802 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.124866 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.124922 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.124967 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.124987 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:28Z","lastTransitionTime":"2025-12-16T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.228136 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.228200 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.228222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.228249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.228271 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:28Z","lastTransitionTime":"2025-12-16T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.330678 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.330715 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.330725 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.330740 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.330750 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:28Z","lastTransitionTime":"2025-12-16T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.337567 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.337636 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.337666 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:28 crc kubenswrapper[4775]: E1216 14:56:28.338355 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:28 crc kubenswrapper[4775]: E1216 14:56:28.338493 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:28 crc kubenswrapper[4775]: E1216 14:56:28.338614 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.433071 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.433100 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.433109 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.433122 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.433131 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:28Z","lastTransitionTime":"2025-12-16T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.535234 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.535274 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.535283 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.535301 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.535313 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:28Z","lastTransitionTime":"2025-12-16T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.638366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.638747 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.638846 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.638973 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.639109 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:28Z","lastTransitionTime":"2025-12-16T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.741524 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.742203 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.742350 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.742448 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.742537 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:28Z","lastTransitionTime":"2025-12-16T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.845256 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.845636 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.845731 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.845827 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.845943 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:28Z","lastTransitionTime":"2025-12-16T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.948539 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.948606 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.948623 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.948647 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:28 crc kubenswrapper[4775]: I1216 14:56:28.948663 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:28Z","lastTransitionTime":"2025-12-16T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.052442 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.052526 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.052542 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.052589 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.052607 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:29Z","lastTransitionTime":"2025-12-16T14:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.156957 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.157019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.157036 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.157059 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.157079 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:29Z","lastTransitionTime":"2025-12-16T14:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.259339 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.259407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.259428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.259454 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.259484 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:29Z","lastTransitionTime":"2025-12-16T14:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.337748 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:29 crc kubenswrapper[4775]: E1216 14:56:29.337970 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.338855 4775 scope.go:117] "RemoveContainer" containerID="cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee" Dec 16 14:56:29 crc kubenswrapper[4775]: E1216 14:56:29.339165 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.362676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.362734 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.362747 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.362766 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.362778 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:29Z","lastTransitionTime":"2025-12-16T14:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.419865 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.419969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.419988 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.420012 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.420031 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:56:29Z","lastTransitionTime":"2025-12-16T14:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.483177 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w"] Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.483700 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.487691 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.489380 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.490273 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.492101 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.638808 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/414d88af-062e-47ed-af78-4071939798d0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-26m5w\" (UID: \"414d88af-062e-47ed-af78-4071939798d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.638869 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/414d88af-062e-47ed-af78-4071939798d0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-26m5w\" (UID: \"414d88af-062e-47ed-af78-4071939798d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.638930 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/414d88af-062e-47ed-af78-4071939798d0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-26m5w\" (UID: \"414d88af-062e-47ed-af78-4071939798d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.638962 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/414d88af-062e-47ed-af78-4071939798d0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-26m5w\" (UID: \"414d88af-062e-47ed-af78-4071939798d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.638998 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/414d88af-062e-47ed-af78-4071939798d0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-26m5w\" (UID: \"414d88af-062e-47ed-af78-4071939798d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.739797 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/414d88af-062e-47ed-af78-4071939798d0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-26m5w\" (UID: \"414d88af-062e-47ed-af78-4071939798d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.740282 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/414d88af-062e-47ed-af78-4071939798d0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-26m5w\" (UID: \"414d88af-062e-47ed-af78-4071939798d0\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.740480 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/414d88af-062e-47ed-af78-4071939798d0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-26m5w\" (UID: \"414d88af-062e-47ed-af78-4071939798d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.740582 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/414d88af-062e-47ed-af78-4071939798d0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-26m5w\" (UID: \"414d88af-062e-47ed-af78-4071939798d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.740711 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/414d88af-062e-47ed-af78-4071939798d0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-26m5w\" (UID: \"414d88af-062e-47ed-af78-4071939798d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.740641 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/414d88af-062e-47ed-af78-4071939798d0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-26m5w\" (UID: \"414d88af-062e-47ed-af78-4071939798d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.740869 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/414d88af-062e-47ed-af78-4071939798d0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-26m5w\" (UID: \"414d88af-062e-47ed-af78-4071939798d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.741725 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/414d88af-062e-47ed-af78-4071939798d0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-26m5w\" (UID: \"414d88af-062e-47ed-af78-4071939798d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.750690 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/414d88af-062e-47ed-af78-4071939798d0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-26m5w\" (UID: \"414d88af-062e-47ed-af78-4071939798d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.765132 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/414d88af-062e-47ed-af78-4071939798d0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-26m5w\" (UID: \"414d88af-062e-47ed-af78-4071939798d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.801880 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.956017 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" event={"ID":"414d88af-062e-47ed-af78-4071939798d0","Type":"ContainerStarted","Data":"1ca78433b8bfe4467c77bd46027faefd7d0eb50de34d1ceffe57a851fbcb89a9"} Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.956435 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" event={"ID":"414d88af-062e-47ed-af78-4071939798d0","Type":"ContainerStarted","Data":"bb3506286fa8716f7137b6a67b58ac8435e9957a4685a61b3a167091aa9de464"} Dec 16 14:56:29 crc kubenswrapper[4775]: I1216 14:56:29.972464 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26m5w" podStartSLOduration=92.972440877 podStartE2EDuration="1m32.972440877s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:56:29.972416926 +0000 UTC m=+114.923495879" watchObservedRunningTime="2025-12-16 14:56:29.972440877 +0000 UTC m=+114.923519810" Dec 16 14:56:30 crc kubenswrapper[4775]: I1216 14:56:30.337479 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:30 crc kubenswrapper[4775]: I1216 14:56:30.337612 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:30 crc kubenswrapper[4775]: E1216 14:56:30.337683 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:30 crc kubenswrapper[4775]: E1216 14:56:30.337819 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:30 crc kubenswrapper[4775]: I1216 14:56:30.337951 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:30 crc kubenswrapper[4775]: E1216 14:56:30.338032 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:31 crc kubenswrapper[4775]: I1216 14:56:31.337141 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:31 crc kubenswrapper[4775]: E1216 14:56:31.337368 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:32 crc kubenswrapper[4775]: I1216 14:56:32.336751 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:32 crc kubenswrapper[4775]: I1216 14:56:32.336851 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:32 crc kubenswrapper[4775]: E1216 14:56:32.336933 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:32 crc kubenswrapper[4775]: E1216 14:56:32.337087 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:32 crc kubenswrapper[4775]: I1216 14:56:32.337124 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:32 crc kubenswrapper[4775]: E1216 14:56:32.337233 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:32 crc kubenswrapper[4775]: I1216 14:56:32.970344 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc2lg_f108f76f-c79a-42b0-b5ac-714d49d9a4d5/kube-multus/1.log" Dec 16 14:56:32 crc kubenswrapper[4775]: I1216 14:56:32.972087 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc2lg_f108f76f-c79a-42b0-b5ac-714d49d9a4d5/kube-multus/0.log" Dec 16 14:56:32 crc kubenswrapper[4775]: I1216 14:56:32.972147 4775 generic.go:334] "Generic (PLEG): container finished" podID="f108f76f-c79a-42b0-b5ac-714d49d9a4d5" containerID="df66b9c818cf970df880bf19cf5d511f23a4ff7bebd59e241339dd26e0ac8fa0" exitCode=1 Dec 16 14:56:32 crc kubenswrapper[4775]: I1216 14:56:32.972190 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mc2lg" event={"ID":"f108f76f-c79a-42b0-b5ac-714d49d9a4d5","Type":"ContainerDied","Data":"df66b9c818cf970df880bf19cf5d511f23a4ff7bebd59e241339dd26e0ac8fa0"} Dec 16 14:56:32 crc kubenswrapper[4775]: I1216 14:56:32.972234 4775 scope.go:117] "RemoveContainer" containerID="e83ce64a14a644d2784242aaded1086b8844e5c368fed9bc44b38a333ece9ec7" Dec 16 14:56:32 crc kubenswrapper[4775]: 
I1216 14:56:32.972785 4775 scope.go:117] "RemoveContainer" containerID="df66b9c818cf970df880bf19cf5d511f23a4ff7bebd59e241339dd26e0ac8fa0" Dec 16 14:56:32 crc kubenswrapper[4775]: E1216 14:56:32.973023 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-mc2lg_openshift-multus(f108f76f-c79a-42b0-b5ac-714d49d9a4d5)\"" pod="openshift-multus/multus-mc2lg" podUID="f108f76f-c79a-42b0-b5ac-714d49d9a4d5" Dec 16 14:56:33 crc kubenswrapper[4775]: I1216 14:56:33.337527 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:33 crc kubenswrapper[4775]: E1216 14:56:33.337708 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:33 crc kubenswrapper[4775]: I1216 14:56:33.977535 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc2lg_f108f76f-c79a-42b0-b5ac-714d49d9a4d5/kube-multus/1.log" Dec 16 14:56:34 crc kubenswrapper[4775]: I1216 14:56:34.337943 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:34 crc kubenswrapper[4775]: I1216 14:56:34.338007 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:34 crc kubenswrapper[4775]: I1216 14:56:34.337985 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:34 crc kubenswrapper[4775]: E1216 14:56:34.338158 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:34 crc kubenswrapper[4775]: E1216 14:56:34.338313 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:34 crc kubenswrapper[4775]: E1216 14:56:34.338468 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:35 crc kubenswrapper[4775]: E1216 14:56:35.335271 4775 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 16 14:56:35 crc kubenswrapper[4775]: I1216 14:56:35.337804 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:35 crc kubenswrapper[4775]: E1216 14:56:35.338728 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:35 crc kubenswrapper[4775]: E1216 14:56:35.726398 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 14:56:36 crc kubenswrapper[4775]: I1216 14:56:36.337439 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:36 crc kubenswrapper[4775]: I1216 14:56:36.337487 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:36 crc kubenswrapper[4775]: I1216 14:56:36.337447 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:36 crc kubenswrapper[4775]: E1216 14:56:36.337612 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:36 crc kubenswrapper[4775]: E1216 14:56:36.337983 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:36 crc kubenswrapper[4775]: E1216 14:56:36.338026 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:37 crc kubenswrapper[4775]: I1216 14:56:37.336953 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:37 crc kubenswrapper[4775]: E1216 14:56:37.337199 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:38 crc kubenswrapper[4775]: I1216 14:56:38.337080 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:38 crc kubenswrapper[4775]: I1216 14:56:38.337198 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:38 crc kubenswrapper[4775]: E1216 14:56:38.337265 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:38 crc kubenswrapper[4775]: I1216 14:56:38.337203 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:38 crc kubenswrapper[4775]: E1216 14:56:38.337382 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:38 crc kubenswrapper[4775]: E1216 14:56:38.337689 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:39 crc kubenswrapper[4775]: I1216 14:56:39.337787 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:39 crc kubenswrapper[4775]: E1216 14:56:39.338064 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:40 crc kubenswrapper[4775]: I1216 14:56:40.337191 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:40 crc kubenswrapper[4775]: I1216 14:56:40.337299 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:40 crc kubenswrapper[4775]: E1216 14:56:40.337347 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:40 crc kubenswrapper[4775]: E1216 14:56:40.337468 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:40 crc kubenswrapper[4775]: I1216 14:56:40.337557 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:40 crc kubenswrapper[4775]: E1216 14:56:40.337610 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:40 crc kubenswrapper[4775]: E1216 14:56:40.728392 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 14:56:41 crc kubenswrapper[4775]: I1216 14:56:41.337525 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:41 crc kubenswrapper[4775]: E1216 14:56:41.337690 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:42 crc kubenswrapper[4775]: I1216 14:56:42.337096 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:42 crc kubenswrapper[4775]: I1216 14:56:42.337097 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:42 crc kubenswrapper[4775]: I1216 14:56:42.337125 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:42 crc kubenswrapper[4775]: E1216 14:56:42.337510 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:42 crc kubenswrapper[4775]: E1216 14:56:42.337724 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:42 crc kubenswrapper[4775]: E1216 14:56:42.337811 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:42 crc kubenswrapper[4775]: I1216 14:56:42.338525 4775 scope.go:117] "RemoveContainer" containerID="cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee" Dec 16 14:56:42 crc kubenswrapper[4775]: E1216 14:56:42.338722 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-79w7z_openshift-ovn-kubernetes(524488dd-74ee-43ea-ac0f-5e04d59af434)\"" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" Dec 16 14:56:43 crc kubenswrapper[4775]: I1216 14:56:43.337828 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:43 crc kubenswrapper[4775]: E1216 14:56:43.338058 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:44 crc kubenswrapper[4775]: I1216 14:56:44.336821 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:44 crc kubenswrapper[4775]: I1216 14:56:44.336845 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:44 crc kubenswrapper[4775]: I1216 14:56:44.336845 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:44 crc kubenswrapper[4775]: E1216 14:56:44.337084 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:44 crc kubenswrapper[4775]: E1216 14:56:44.337239 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:44 crc kubenswrapper[4775]: E1216 14:56:44.337386 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:45 crc kubenswrapper[4775]: I1216 14:56:45.337698 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:45 crc kubenswrapper[4775]: E1216 14:56:45.339512 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:45 crc kubenswrapper[4775]: E1216 14:56:45.730097 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 14:56:46 crc kubenswrapper[4775]: I1216 14:56:46.336810 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:46 crc kubenswrapper[4775]: I1216 14:56:46.336914 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:46 crc kubenswrapper[4775]: E1216 14:56:46.337016 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:46 crc kubenswrapper[4775]: E1216 14:56:46.337102 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:46 crc kubenswrapper[4775]: I1216 14:56:46.337490 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:46 crc kubenswrapper[4775]: E1216 14:56:46.337618 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:47 crc kubenswrapper[4775]: I1216 14:56:47.337422 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:47 crc kubenswrapper[4775]: E1216 14:56:47.337671 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:47 crc kubenswrapper[4775]: I1216 14:56:47.337977 4775 scope.go:117] "RemoveContainer" containerID="df66b9c818cf970df880bf19cf5d511f23a4ff7bebd59e241339dd26e0ac8fa0" Dec 16 14:56:48 crc kubenswrapper[4775]: I1216 14:56:48.028495 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc2lg_f108f76f-c79a-42b0-b5ac-714d49d9a4d5/kube-multus/1.log" Dec 16 14:56:48 crc kubenswrapper[4775]: I1216 14:56:48.028858 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mc2lg" event={"ID":"f108f76f-c79a-42b0-b5ac-714d49d9a4d5","Type":"ContainerStarted","Data":"bd5bf8d9aa860c638df224881d6e2c78b66ea54d6e2e871aebcdf55ac2dc99ce"} Dec 16 14:56:48 crc kubenswrapper[4775]: I1216 14:56:48.337830 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:48 crc kubenswrapper[4775]: E1216 14:56:48.338031 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:48 crc kubenswrapper[4775]: I1216 14:56:48.337860 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:48 crc kubenswrapper[4775]: I1216 14:56:48.337861 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:48 crc kubenswrapper[4775]: E1216 14:56:48.338250 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:48 crc kubenswrapper[4775]: E1216 14:56:48.338337 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:49 crc kubenswrapper[4775]: I1216 14:56:49.336858 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:49 crc kubenswrapper[4775]: E1216 14:56:49.337076 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:50 crc kubenswrapper[4775]: I1216 14:56:50.336823 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:50 crc kubenswrapper[4775]: I1216 14:56:50.336933 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:50 crc kubenswrapper[4775]: E1216 14:56:50.336979 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:50 crc kubenswrapper[4775]: I1216 14:56:50.336933 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:50 crc kubenswrapper[4775]: E1216 14:56:50.337144 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:50 crc kubenswrapper[4775]: E1216 14:56:50.337078 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:50 crc kubenswrapper[4775]: E1216 14:56:50.731117 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 14:56:51 crc kubenswrapper[4775]: I1216 14:56:51.337431 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:51 crc kubenswrapper[4775]: E1216 14:56:51.337667 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:52 crc kubenswrapper[4775]: I1216 14:56:52.337596 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:52 crc kubenswrapper[4775]: E1216 14:56:52.337770 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:52 crc kubenswrapper[4775]: I1216 14:56:52.337979 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:52 crc kubenswrapper[4775]: I1216 14:56:52.338103 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:52 crc kubenswrapper[4775]: E1216 14:56:52.338265 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:52 crc kubenswrapper[4775]: E1216 14:56:52.338362 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:53 crc kubenswrapper[4775]: I1216 14:56:53.337761 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:53 crc kubenswrapper[4775]: E1216 14:56:53.338006 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:54 crc kubenswrapper[4775]: I1216 14:56:54.337445 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:54 crc kubenswrapper[4775]: E1216 14:56:54.337665 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:54 crc kubenswrapper[4775]: I1216 14:56:54.338010 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:54 crc kubenswrapper[4775]: I1216 14:56:54.338060 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:54 crc kubenswrapper[4775]: E1216 14:56:54.338214 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:54 crc kubenswrapper[4775]: E1216 14:56:54.338317 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:55 crc kubenswrapper[4775]: I1216 14:56:55.337405 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:55 crc kubenswrapper[4775]: E1216 14:56:55.338783 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:55 crc kubenswrapper[4775]: E1216 14:56:55.732486 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 14:56:56 crc kubenswrapper[4775]: I1216 14:56:56.337357 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:56 crc kubenswrapper[4775]: I1216 14:56:56.337467 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:56 crc kubenswrapper[4775]: I1216 14:56:56.337357 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:56 crc kubenswrapper[4775]: E1216 14:56:56.337592 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:56 crc kubenswrapper[4775]: E1216 14:56:56.337499 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:56 crc kubenswrapper[4775]: E1216 14:56:56.337764 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:57 crc kubenswrapper[4775]: I1216 14:56:57.337614 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:57 crc kubenswrapper[4775]: E1216 14:56:57.337816 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:56:57 crc kubenswrapper[4775]: I1216 14:56:57.338952 4775 scope.go:117] "RemoveContainer" containerID="cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee" Dec 16 14:56:58 crc kubenswrapper[4775]: I1216 14:56:58.337185 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:58 crc kubenswrapper[4775]: I1216 14:56:58.337249 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:56:58 crc kubenswrapper[4775]: E1216 14:56:58.337760 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:58 crc kubenswrapper[4775]: E1216 14:56:58.337853 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:56:58 crc kubenswrapper[4775]: I1216 14:56:58.338018 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:56:58 crc kubenswrapper[4775]: E1216 14:56:58.338080 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:56:58 crc kubenswrapper[4775]: I1216 14:56:58.843496 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c6mdt"] Dec 16 14:56:59 crc kubenswrapper[4775]: I1216 14:56:59.067793 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovnkube-controller/3.log" Dec 16 14:56:59 crc kubenswrapper[4775]: I1216 14:56:59.071211 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerStarted","Data":"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7"} Dec 16 14:56:59 crc kubenswrapper[4775]: I1216 14:56:59.071230 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:56:59 crc kubenswrapper[4775]: I1216 14:56:59.071927 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 14:56:59 crc kubenswrapper[4775]: E1216 14:56:59.072344 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:56:59 crc kubenswrapper[4775]: I1216 14:56:59.101723 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podStartSLOduration=122.101693692 podStartE2EDuration="2m2.101693692s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:56:59.100620968 +0000 UTC m=+144.051699911" watchObservedRunningTime="2025-12-16 14:56:59.101693692 +0000 UTC m=+144.052772615" Dec 16 14:56:59 crc kubenswrapper[4775]: I1216 14:56:59.337386 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:56:59 crc kubenswrapper[4775]: E1216 14:56:59.337595 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:00 crc kubenswrapper[4775]: I1216 14:57:00.337871 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:00 crc kubenswrapper[4775]: I1216 14:57:00.337976 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:57:00 crc kubenswrapper[4775]: E1216 14:57:00.338163 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:00 crc kubenswrapper[4775]: I1216 14:57:00.338240 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:00 crc kubenswrapper[4775]: E1216 14:57:00.338496 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:57:00 crc kubenswrapper[4775]: E1216 14:57:00.338601 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:00 crc kubenswrapper[4775]: E1216 14:57:00.734095 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 14:57:01 crc kubenswrapper[4775]: I1216 14:57:01.337592 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:01 crc kubenswrapper[4775]: E1216 14:57:01.337874 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:02 crc kubenswrapper[4775]: I1216 14:57:02.251871 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:02 crc kubenswrapper[4775]: E1216 14:57:02.252279 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:04.252199164 +0000 UTC m=+269.203278147 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:02 crc kubenswrapper[4775]: I1216 14:57:02.252401 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:02 crc kubenswrapper[4775]: E1216 14:57:02.252589 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:57:02 crc kubenswrapper[4775]: E1216 14:57:02.252661 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:59:04.252643438 +0000 UTC m=+269.203722401 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:57:02 crc kubenswrapper[4775]: E1216 14:57:02.252686 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:57:02 crc kubenswrapper[4775]: E1216 14:57:02.252761 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:59:04.252738791 +0000 UTC m=+269.203817754 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:57:02 crc kubenswrapper[4775]: I1216 14:57:02.252561 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:02 crc kubenswrapper[4775]: I1216 14:57:02.338318 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:57:02 crc kubenswrapper[4775]: I1216 14:57:02.338406 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:02 crc kubenswrapper[4775]: I1216 14:57:02.338524 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:02 crc kubenswrapper[4775]: E1216 14:57:02.338527 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:57:02 crc kubenswrapper[4775]: E1216 14:57:02.338701 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:02 crc kubenswrapper[4775]: E1216 14:57:02.341964 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:02 crc kubenswrapper[4775]: I1216 14:57:02.354594 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:02 crc kubenswrapper[4775]: I1216 14:57:02.354682 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:02 crc kubenswrapper[4775]: E1216 14:57:02.354835 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:57:02 crc kubenswrapper[4775]: E1216 14:57:02.354871 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:57:02 crc kubenswrapper[4775]: E1216 14:57:02.354908 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:02 crc kubenswrapper[4775]: E1216 14:57:02.354882 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: 
object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:57:02 crc kubenswrapper[4775]: E1216 14:57:02.354954 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:57:02 crc kubenswrapper[4775]: E1216 14:57:02.354974 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:02 crc kubenswrapper[4775]: E1216 14:57:02.354984 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 14:59:04.354961236 +0000 UTC m=+269.306040179 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:02 crc kubenswrapper[4775]: E1216 14:57:02.355046 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 14:59:04.355021728 +0000 UTC m=+269.306100691 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:02 crc kubenswrapper[4775]: I1216 14:57:02.869673 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 14:57:02 crc kubenswrapper[4775]: I1216 14:57:02.869766 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 14:57:03 crc kubenswrapper[4775]: I1216 14:57:03.337118 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:03 crc kubenswrapper[4775]: E1216 14:57:03.337295 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:04 crc kubenswrapper[4775]: I1216 14:57:04.337396 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:04 crc kubenswrapper[4775]: I1216 14:57:04.337440 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:57:04 crc kubenswrapper[4775]: I1216 14:57:04.337556 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:04 crc kubenswrapper[4775]: E1216 14:57:04.337554 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:04 crc kubenswrapper[4775]: E1216 14:57:04.337674 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:04 crc kubenswrapper[4775]: E1216 14:57:04.337776 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c6mdt" podUID="3d592ae8-792f-4cc5-9a32-b278deb33810" Dec 16 14:57:05 crc kubenswrapper[4775]: I1216 14:57:05.337330 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:05 crc kubenswrapper[4775]: E1216 14:57:05.339209 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:06 crc kubenswrapper[4775]: I1216 14:57:06.337215 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:06 crc kubenswrapper[4775]: I1216 14:57:06.337269 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:57:06 crc kubenswrapper[4775]: I1216 14:57:06.337306 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:06 crc kubenswrapper[4775]: I1216 14:57:06.339872 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 16 14:57:06 crc kubenswrapper[4775]: I1216 14:57:06.340099 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 16 14:57:06 crc kubenswrapper[4775]: I1216 14:57:06.340129 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 16 14:57:06 crc kubenswrapper[4775]: I1216 14:57:06.340130 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 16 14:57:07 crc kubenswrapper[4775]: I1216 14:57:07.337167 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:07 crc kubenswrapper[4775]: I1216 14:57:07.341260 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 16 14:57:07 crc kubenswrapper[4775]: I1216 14:57:07.342718 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.498192 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.546600 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nxkgw"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.547425 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.549849 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5fjmz"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.550859 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.557597 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.558343 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.560005 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.560820 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.560019 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.566878 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.567228 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.567401 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.567743 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.560127 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.560201 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.560207 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.561280 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.566724 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.569539 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.571173 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.576184 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.581844 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-27jm2"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.582663 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-27jm2" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.582715 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.583382 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ncw52"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.584114 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.585306 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.586309 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-twhnr"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.586984 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.587669 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.588218 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.588750 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.588932 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.589092 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.589748 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.589878 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 16 14:57:10 crc 
kubenswrapper[4775]: I1216 14:57:10.590494 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.590674 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.590775 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.590688 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.591058 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.591264 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.591291 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.591533 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.591862 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.595543 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.595961 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.596136 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.596308 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.604058 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.606203 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s88th"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.606521 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.606812 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.607009 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s88th" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.609955 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.610908 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.610999 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.611256 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bgqpz"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.611442 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.611598 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.611876 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bgqpz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.613735 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.614284 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.614816 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.615121 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.615142 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.616016 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.617948 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v798k"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.618474 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v798k" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.618963 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.619502 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.622150 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.622327 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-fc2jr"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.623111 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.623639 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.624356 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.626349 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-5bmtw"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.626824 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-5bmtw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.632336 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.633151 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ms9lk"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.634162 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.634192 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-twhnr"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.634304 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.664392 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.667956 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-twhnr\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668014 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d55038e1-9978-48f8-b430-78c7da1ca5e5-encryption-config\") pod \"apiserver-76f77b778f-nxkgw\" 
(UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668048 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72557843-c8c0-4228-89fc-e735935e10a3-serving-cert\") pod \"openshift-config-operator-7777fb866f-xrjhk\" (UID: \"72557843-c8c0-4228-89fc-e735935e10a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668074 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/be0d6859-aa4c-4a58-97ea-3f3657d4773f-etcd-client\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668101 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqpxb\" (UniqueName: \"kubernetes.io/projected/42c1b38d-39c7-4bfd-ae8e-c8ef6651565f-kube-api-access-xqpxb\") pod \"authentication-operator-69f744f599-ncw52\" (UID: \"42c1b38d-39c7-4bfd-ae8e-c8ef6651565f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668130 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmg2m\" (UniqueName: \"kubernetes.io/projected/e1a2834e-159c-47f0-81a8-87d37d89a22a-kube-api-access-gmg2m\") pod \"machine-api-operator-5694c8668f-5fjmz\" (UID: \"e1a2834e-159c-47f0-81a8-87d37d89a22a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668175 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1a2834e-159c-47f0-81a8-87d37d89a22a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5fjmz\" (UID: \"e1a2834e-159c-47f0-81a8-87d37d89a22a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668205 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d55038e1-9978-48f8-b430-78c7da1ca5e5-etcd-serving-ca\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668232 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d55038e1-9978-48f8-b430-78c7da1ca5e5-audit-dir\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668262 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a3a6d07-7d02-4d47-836e-75a930433d87-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-89cwl\" (UID: \"2a3a6d07-7d02-4d47-836e-75a930433d87\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668291 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tq2g\" (UniqueName: \"kubernetes.io/projected/54b9e242-469b-450e-b2a1-741d9ee601a1-kube-api-access-2tq2g\") pod \"console-operator-58897d9998-bgqpz\" (UID: 
\"54b9e242-469b-450e-b2a1-741d9ee601a1\") " pod="openshift-console-operator/console-operator-58897d9998-bgqpz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668336 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d55038e1-9978-48f8-b430-78c7da1ca5e5-config\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668369 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be0d6859-aa4c-4a58-97ea-3f3657d4773f-serving-cert\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668395 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/be0d6859-aa4c-4a58-97ea-3f3657d4773f-encryption-config\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668426 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42c1b38d-39c7-4bfd-ae8e-c8ef6651565f-serving-cert\") pod \"authentication-operator-69f744f599-ncw52\" (UID: \"42c1b38d-39c7-4bfd-ae8e-c8ef6651565f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668428 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl"] Dec 16 
14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668482 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r66w7\" (UniqueName: \"kubernetes.io/projected/42c55268-c122-4eb0-9508-b5507897990b-kube-api-access-r66w7\") pod \"openshift-controller-manager-operator-756b6f6bc6-s88th\" (UID: \"42c55268-c122-4eb0-9508-b5507897990b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s88th" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668595 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668802 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68f674b8-b7c3-43e8-b132-7d6b881cbd31-serving-cert\") pod \"controller-manager-879f6c89f-twhnr\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668850 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be0d6859-aa4c-4a58-97ea-3f3657d4773f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668879 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq72f\" (UniqueName: \"kubernetes.io/projected/2a3a6d07-7d02-4d47-836e-75a930433d87-kube-api-access-kq72f\") pod \"cluster-image-registry-operator-dc59b4c8b-89cwl\" (UID: \"2a3a6d07-7d02-4d47-836e-75a930433d87\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668921 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d55038e1-9978-48f8-b430-78c7da1ca5e5-audit\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.668990 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-config\") pod \"route-controller-manager-6576b87f9c-dqxgd\" (UID: \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.669023 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-serving-cert\") pod \"route-controller-manager-6576b87f9c-dqxgd\" (UID: \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.669051 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-client-ca\") pod \"controller-manager-879f6c89f-twhnr\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.669087 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/be0d6859-aa4c-4a58-97ea-3f3657d4773f-audit-policies\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.669121 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42c1b38d-39c7-4bfd-ae8e-c8ef6651565f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ncw52\" (UID: \"42c1b38d-39c7-4bfd-ae8e-c8ef6651565f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.669193 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-client-ca\") pod \"route-controller-manager-6576b87f9c-dqxgd\" (UID: \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.669257 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42c1b38d-39c7-4bfd-ae8e-c8ef6651565f-service-ca-bundle\") pod \"authentication-operator-69f744f599-ncw52\" (UID: \"42c1b38d-39c7-4bfd-ae8e-c8ef6651565f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.669289 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/72557843-c8c0-4228-89fc-e735935e10a3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xrjhk\" (UID: \"72557843-c8c0-4228-89fc-e735935e10a3\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.669711 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-974sg\" (UniqueName: \"kubernetes.io/projected/d55038e1-9978-48f8-b430-78c7da1ca5e5-kube-api-access-974sg\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.669753 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/be0d6859-aa4c-4a58-97ea-3f3657d4773f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.669816 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2a3a6d07-7d02-4d47-836e-75a930433d87-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-89cwl\" (UID: \"2a3a6d07-7d02-4d47-836e-75a930433d87\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.669843 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d55038e1-9978-48f8-b430-78c7da1ca5e5-node-pullsecrets\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.669887 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/42c55268-c122-4eb0-9508-b5507897990b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s88th\" (UID: \"42c55268-c122-4eb0-9508-b5507897990b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s88th" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.669934 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a2834e-159c-47f0-81a8-87d37d89a22a-config\") pod \"machine-api-operator-5694c8668f-5fjmz\" (UID: \"e1a2834e-159c-47f0-81a8-87d37d89a22a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.670054 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.670179 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.670313 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54b9e242-469b-450e-b2a1-741d9ee601a1-serving-cert\") pod \"console-operator-58897d9998-bgqpz\" (UID: \"54b9e242-469b-450e-b2a1-741d9ee601a1\") " pod="openshift-console-operator/console-operator-58897d9998-bgqpz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.670366 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnf49\" (UniqueName: \"kubernetes.io/projected/be0d6859-aa4c-4a58-97ea-3f3657d4773f-kube-api-access-jnf49\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc 
kubenswrapper[4775]: I1216 14:57:10.670409 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.670463 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d55038e1-9978-48f8-b430-78c7da1ca5e5-image-import-ca\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.670491 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.670549 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42c1b38d-39c7-4bfd-ae8e-c8ef6651565f-config\") pod \"authentication-operator-69f744f599-ncw52\" (UID: \"42c1b38d-39c7-4bfd-ae8e-c8ef6651565f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.670656 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.670708 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e1a2834e-159c-47f0-81a8-87d37d89a22a-images\") pod \"machine-api-operator-5694c8668f-5fjmz\" (UID: \"e1a2834e-159c-47f0-81a8-87d37d89a22a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.670762 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/d55038e1-9978-48f8-b430-78c7da1ca5e5-etcd-client\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.670898 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.670933 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zq42\" (UniqueName: \"kubernetes.io/projected/68f674b8-b7c3-43e8-b132-7d6b881cbd31-kube-api-access-9zq42\") pod \"controller-manager-879f6c89f-twhnr\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.671042 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a3a6d07-7d02-4d47-836e-75a930433d87-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-89cwl\" (UID: \"2a3a6d07-7d02-4d47-836e-75a930433d87\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.671102 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-config\") pod \"controller-manager-879f6c89f-twhnr\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.671154 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxzrv\" (UniqueName: 
\"kubernetes.io/projected/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-kube-api-access-vxzrv\") pod \"route-controller-manager-6576b87f9c-dqxgd\" (UID: \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.671196 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54b9e242-469b-450e-b2a1-741d9ee601a1-trusted-ca\") pod \"console-operator-58897d9998-bgqpz\" (UID: \"54b9e242-469b-450e-b2a1-741d9ee601a1\") " pod="openshift-console-operator/console-operator-58897d9998-bgqpz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.671299 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54b9e242-469b-450e-b2a1-741d9ee601a1-config\") pod \"console-operator-58897d9998-bgqpz\" (UID: \"54b9e242-469b-450e-b2a1-741d9ee601a1\") " pod="openshift-console-operator/console-operator-58897d9998-bgqpz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.671346 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83cafff0-05df-46bf-a875-f03ad748a9fc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-27jm2\" (UID: \"83cafff0-05df-46bf-a875-f03ad748a9fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-27jm2" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.671379 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.671572 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/83cafff0-05df-46bf-a875-f03ad748a9fc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-27jm2\" (UID: \"83cafff0-05df-46bf-a875-f03ad748a9fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-27jm2" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.671607 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9vc8\" (UniqueName: \"kubernetes.io/projected/83cafff0-05df-46bf-a875-f03ad748a9fc-kube-api-access-f9vc8\") pod \"openshift-apiserver-operator-796bbdcf4f-27jm2\" (UID: \"83cafff0-05df-46bf-a875-f03ad748a9fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-27jm2" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.671654 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4zbr\" (UniqueName: \"kubernetes.io/projected/72557843-c8c0-4228-89fc-e735935e10a3-kube-api-access-b4zbr\") pod \"openshift-config-operator-7777fb866f-xrjhk\" (UID: \"72557843-c8c0-4228-89fc-e735935e10a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.671687 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42c55268-c122-4eb0-9508-b5507897990b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s88th\" (UID: \"42c55268-c122-4eb0-9508-b5507897990b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s88th" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.671751 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d55038e1-9978-48f8-b430-78c7da1ca5e5-serving-cert\") pod \"apiserver-76f77b778f-nxkgw\" (UID: 
\"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.671798 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d55038e1-9978-48f8-b430-78c7da1ca5e5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.671834 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be0d6859-aa4c-4a58-97ea-3f3657d4773f-audit-dir\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.673209 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.673586 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.673726 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.673893 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.674084 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.674263 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"console-config" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.675611 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.675940 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.676192 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.676558 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.676775 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.676935 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.677186 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.677222 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.676945 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.677223 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.677445 4775 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.686204 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.677383 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.686630 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.686883 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.699063 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.699130 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.699218 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.699386 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.699504 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.699872 4775 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.699993 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-27jm2"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.700099 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.700500 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.701057 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.701087 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.701878 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.701985 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.702086 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.702230 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.701195 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 
14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.703811 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.703545 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s88th"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.703489 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.703975 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.704015 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.704038 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.709720 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.718355 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sxl49"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.719609 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.722888 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.723518 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rpxqq"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.724379 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rpxqq" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.724717 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2tkrk"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.725122 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2tkrk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.725326 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zmgzc"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.725937 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zmgzc" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.726195 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rfplx"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.726521 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rfplx" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.728825 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ql6wt"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.729087 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.729463 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.729699 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ql6wt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.730543 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.734225 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.735492 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.738339 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.740030 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-d9jzf"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.741184 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-d9jzf" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.742011 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b5vwq"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.742638 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.742784 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.744945 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.752577 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.752857 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b5vwq" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.761421 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-dw4sb"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.763286 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.763519 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.765089 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.766717 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-58rfh"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.767512 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.768015 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t668z"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.768880 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-t668z" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.769317 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-54js6"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.769858 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-54js6" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.770220 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9gmd5"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.771022 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9gmd5" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.771160 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772068 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772376 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2r426"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772480 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d55038e1-9978-48f8-b430-78c7da1ca5e5-audit\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772523 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772553 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-config\") pod \"route-controller-manager-6576b87f9c-dqxgd\" (UID: \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 
14:57:10.772581 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772608 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwl9g\" (UniqueName: \"kubernetes.io/projected/2b5f39f2-f4e2-4306-b64c-669ca82f8869-kube-api-access-lwl9g\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772633 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-serving-cert\") pod \"route-controller-manager-6576b87f9c-dqxgd\" (UID: \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772650 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-client-ca\") pod \"controller-manager-879f6c89f-twhnr\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772666 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/be0d6859-aa4c-4a58-97ea-3f3657d4773f-audit-policies\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: 
\"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772701 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42c1b38d-39c7-4bfd-ae8e-c8ef6651565f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ncw52\" (UID: \"42c1b38d-39c7-4bfd-ae8e-c8ef6651565f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772719 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-console-config\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772745 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-client-ca\") pod \"route-controller-manager-6576b87f9c-dqxgd\" (UID: \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772763 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42c1b38d-39c7-4bfd-ae8e-c8ef6651565f-service-ca-bundle\") pod \"authentication-operator-69f744f599-ncw52\" (UID: \"42c1b38d-39c7-4bfd-ae8e-c8ef6651565f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772781 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" 
(UniqueName: \"kubernetes.io/empty-dir/72557843-c8c0-4228-89fc-e735935e10a3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xrjhk\" (UID: \"72557843-c8c0-4228-89fc-e735935e10a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772800 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/be0d6859-aa4c-4a58-97ea-3f3657d4773f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772817 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2a3a6d07-7d02-4d47-836e-75a930433d87-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-89cwl\" (UID: \"2a3a6d07-7d02-4d47-836e-75a930433d87\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772835 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-974sg\" (UniqueName: \"kubernetes.io/projected/d55038e1-9978-48f8-b430-78c7da1ca5e5-kube-api-access-974sg\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772857 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c21af7b0-6f27-43de-8c44-6e6519262019-console-serving-cert\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc 
kubenswrapper[4775]: I1216 14:57:10.772881 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d55038e1-9978-48f8-b430-78c7da1ca5e5-node-pullsecrets\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772951 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42c55268-c122-4eb0-9508-b5507897990b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s88th\" (UID: \"42c55268-c122-4eb0-9508-b5507897990b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s88th" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772974 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772982 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.772999 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a2834e-159c-47f0-81a8-87d37d89a22a-config\") pod \"machine-api-operator-5694c8668f-5fjmz\" (UID: \"e1a2834e-159c-47f0-81a8-87d37d89a22a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.773023 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4e5a6b47-360c-4b64-9ba3-15edeb2006fa-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-v798k\" (UID: \"4e5a6b47-360c-4b64-9ba3-15edeb2006fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v798k" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.773048 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.773069 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcfwq\" (UniqueName: \"kubernetes.io/projected/56428379-949d-4ba8-9b32-8ee7432abba7-kube-api-access-wcfwq\") pod \"machine-approver-56656f9798-xklhw\" (UID: \"56428379-949d-4ba8-9b32-8ee7432abba7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.773076 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.773349 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d55038e1-9978-48f8-b430-78c7da1ca5e5-audit\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.773827 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.774473 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-config\") pod \"route-controller-manager-6576b87f9c-dqxgd\" (UID: \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.773098 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.774617 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54b9e242-469b-450e-b2a1-741d9ee601a1-serving-cert\") pod \"console-operator-58897d9998-bgqpz\" (UID: \"54b9e242-469b-450e-b2a1-741d9ee601a1\") " pod="openshift-console-operator/console-operator-58897d9998-bgqpz" Dec 16 14:57:10 crc 
kubenswrapper[4775]: I1216 14:57:10.774663 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnf49\" (UniqueName: \"kubernetes.io/projected/be0d6859-aa4c-4a58-97ea-3f3657d4773f-kube-api-access-jnf49\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.774704 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/56428379-949d-4ba8-9b32-8ee7432abba7-auth-proxy-config\") pod \"machine-approver-56656f9798-xklhw\" (UID: \"56428379-949d-4ba8-9b32-8ee7432abba7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.774791 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d55038e1-9978-48f8-b430-78c7da1ca5e5-image-import-ca\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.774839 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42c1b38d-39c7-4bfd-ae8e-c8ef6651565f-config\") pod \"authentication-operator-69f744f599-ncw52\" (UID: \"42c1b38d-39c7-4bfd-ae8e-c8ef6651565f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.774878 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.774939 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-service-ca\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.774966 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.774994 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e1a2834e-159c-47f0-81a8-87d37d89a22a-images\") pod \"machine-api-operator-5694c8668f-5fjmz\" (UID: \"e1a2834e-159c-47f0-81a8-87d37d89a22a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775020 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d55038e1-9978-48f8-b430-78c7da1ca5e5-etcd-client\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775047 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxr94\" (UniqueName: 
\"kubernetes.io/projected/edb75c2a-9e6f-4a80-aadd-38416ba9c9a4-kube-api-access-dxr94\") pod \"downloads-7954f5f757-5bmtw\" (UID: \"edb75c2a-9e6f-4a80-aadd-38416ba9c9a4\") " pod="openshift-console/downloads-7954f5f757-5bmtw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775086 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zq42\" (UniqueName: \"kubernetes.io/projected/68f674b8-b7c3-43e8-b132-7d6b881cbd31-kube-api-access-9zq42\") pod \"controller-manager-879f6c89f-twhnr\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775112 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b5f39f2-f4e2-4306-b64c-669ca82f8869-audit-dir\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775146 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c21af7b0-6f27-43de-8c44-6e6519262019-console-oauth-config\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775165 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775281 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a3a6d07-7d02-4d47-836e-75a930433d87-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-89cwl\" (UID: 
\"2a3a6d07-7d02-4d47-836e-75a930433d87\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775331 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775363 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-config\") pod \"controller-manager-879f6c89f-twhnr\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775390 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxzrv\" (UniqueName: \"kubernetes.io/projected/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-kube-api-access-vxzrv\") pod \"route-controller-manager-6576b87f9c-dqxgd\" (UID: \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775411 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54b9e242-469b-450e-b2a1-741d9ee601a1-trusted-ca\") pod \"console-operator-58897d9998-bgqpz\" (UID: \"54b9e242-469b-450e-b2a1-741d9ee601a1\") " pod="openshift-console-operator/console-operator-58897d9998-bgqpz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775432 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/83cafff0-05df-46bf-a875-f03ad748a9fc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-27jm2\" (UID: \"83cafff0-05df-46bf-a875-f03ad748a9fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-27jm2" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775479 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54b9e242-469b-450e-b2a1-741d9ee601a1-config\") pod \"console-operator-58897d9998-bgqpz\" (UID: \"54b9e242-469b-450e-b2a1-741d9ee601a1\") " pod="openshift-console-operator/console-operator-58897d9998-bgqpz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775504 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83cafff0-05df-46bf-a875-f03ad748a9fc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-27jm2\" (UID: \"83cafff0-05df-46bf-a875-f03ad748a9fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-27jm2" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775526 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9vc8\" (UniqueName: \"kubernetes.io/projected/83cafff0-05df-46bf-a875-f03ad748a9fc-kube-api-access-f9vc8\") pod \"openshift-apiserver-operator-796bbdcf4f-27jm2\" (UID: \"83cafff0-05df-46bf-a875-f03ad748a9fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-27jm2" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775552 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4zbr\" (UniqueName: \"kubernetes.io/projected/72557843-c8c0-4228-89fc-e735935e10a3-kube-api-access-b4zbr\") pod \"openshift-config-operator-7777fb866f-xrjhk\" (UID: \"72557843-c8c0-4228-89fc-e735935e10a3\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775574 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775580 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42c55268-c122-4eb0-9508-b5507897990b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s88th\" (UID: \"42c55268-c122-4eb0-9508-b5507897990b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s88th" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775608 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775646 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d55038e1-9978-48f8-b430-78c7da1ca5e5-serving-cert\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775670 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d55038e1-9978-48f8-b430-78c7da1ca5e5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 
14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775691 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be0d6859-aa4c-4a58-97ea-3f3657d4773f-audit-dir\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775719 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqpxb\" (UniqueName: \"kubernetes.io/projected/42c1b38d-39c7-4bfd-ae8e-c8ef6651565f-kube-api-access-xqpxb\") pod \"authentication-operator-69f744f599-ncw52\" (UID: \"42c1b38d-39c7-4bfd-ae8e-c8ef6651565f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775721 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a2834e-159c-47f0-81a8-87d37d89a22a-config\") pod \"machine-api-operator-5694c8668f-5fjmz\" (UID: \"e1a2834e-159c-47f0-81a8-87d37d89a22a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775748 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775774 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-twhnr\" (UID: 
\"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775800 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d55038e1-9978-48f8-b430-78c7da1ca5e5-encryption-config\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.775824 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72557843-c8c0-4228-89fc-e735935e10a3-serving-cert\") pod \"openshift-config-operator-7777fb866f-xrjhk\" (UID: \"72557843-c8c0-4228-89fc-e735935e10a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776135 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/be0d6859-aa4c-4a58-97ea-3f3657d4773f-etcd-client\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776175 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776205 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmg2m\" (UniqueName: 
\"kubernetes.io/projected/e1a2834e-159c-47f0-81a8-87d37d89a22a-kube-api-access-gmg2m\") pod \"machine-api-operator-5694c8668f-5fjmz\" (UID: \"e1a2834e-159c-47f0-81a8-87d37d89a22a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776230 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d55038e1-9978-48f8-b430-78c7da1ca5e5-etcd-serving-ca\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776252 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d55038e1-9978-48f8-b430-78c7da1ca5e5-audit-dir\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776297 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1a2834e-159c-47f0-81a8-87d37d89a22a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5fjmz\" (UID: \"e1a2834e-159c-47f0-81a8-87d37d89a22a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776352 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a3a6d07-7d02-4d47-836e-75a930433d87-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-89cwl\" (UID: \"2a3a6d07-7d02-4d47-836e-75a930433d87\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776380 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-audit-policies\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776408 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-trusted-ca-bundle\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776540 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56428379-949d-4ba8-9b32-8ee7432abba7-config\") pod \"machine-approver-56656f9798-xklhw\" (UID: \"56428379-949d-4ba8-9b32-8ee7432abba7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776578 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tq2g\" (UniqueName: \"kubernetes.io/projected/54b9e242-469b-450e-b2a1-741d9ee601a1-kube-api-access-2tq2g\") pod \"console-operator-58897d9998-bgqpz\" (UID: \"54b9e242-469b-450e-b2a1-741d9ee601a1\") " pod="openshift-console-operator/console-operator-58897d9998-bgqpz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776608 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d55038e1-9978-48f8-b430-78c7da1ca5e5-config\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 
crc kubenswrapper[4775]: I1216 14:57:10.776634 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be0d6859-aa4c-4a58-97ea-3f3657d4773f-serving-cert\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776657 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/be0d6859-aa4c-4a58-97ea-3f3657d4773f-encryption-config\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776684 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42c1b38d-39c7-4bfd-ae8e-c8ef6651565f-serving-cert\") pod \"authentication-operator-69f744f599-ncw52\" (UID: \"42c1b38d-39c7-4bfd-ae8e-c8ef6651565f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776709 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r66w7\" (UniqueName: \"kubernetes.io/projected/42c55268-c122-4eb0-9508-b5507897990b-kube-api-access-r66w7\") pod \"openshift-controller-manager-operator-756b6f6bc6-s88th\" (UID: \"42c55268-c122-4eb0-9508-b5507897990b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s88th" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776736 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwtqh\" (UniqueName: \"kubernetes.io/projected/4e5a6b47-360c-4b64-9ba3-15edeb2006fa-kube-api-access-hwtqh\") pod 
\"cluster-samples-operator-665b6dd947-v798k\" (UID: \"4e5a6b47-360c-4b64-9ba3-15edeb2006fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v798k" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776769 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68f674b8-b7c3-43e8-b132-7d6b881cbd31-serving-cert\") pod \"controller-manager-879f6c89f-twhnr\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776775 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42c1b38d-39c7-4bfd-ae8e-c8ef6651565f-config\") pod \"authentication-operator-69f744f599-ncw52\" (UID: \"42c1b38d-39c7-4bfd-ae8e-c8ef6651565f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776794 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be0d6859-aa4c-4a58-97ea-3f3657d4773f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776823 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq72f\" (UniqueName: \"kubernetes.io/projected/2a3a6d07-7d02-4d47-836e-75a930433d87-kube-api-access-kq72f\") pod \"cluster-image-registry-operator-dc59b4c8b-89cwl\" (UID: \"2a3a6d07-7d02-4d47-836e-75a930433d87\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776849 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/56428379-949d-4ba8-9b32-8ee7432abba7-machine-approver-tls\") pod \"machine-approver-56656f9798-xklhw\" (UID: \"56428379-949d-4ba8-9b32-8ee7432abba7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776876 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-oauth-serving-cert\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.776921 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsgxd\" (UniqueName: \"kubernetes.io/projected/c21af7b0-6f27-43de-8c44-6e6519262019-kube-api-access-tsgxd\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.777232 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-client-ca\") pod \"controller-manager-879f6c89f-twhnr\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.777704 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/be0d6859-aa4c-4a58-97ea-3f3657d4773f-audit-policies\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 
14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.778137 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d55038e1-9978-48f8-b430-78c7da1ca5e5-image-import-ca\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.778500 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42c1b38d-39c7-4bfd-ae8e-c8ef6651565f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ncw52\" (UID: \"42c1b38d-39c7-4bfd-ae8e-c8ef6651565f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.778895 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-config\") pod \"controller-manager-879f6c89f-twhnr\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.779206 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.779365 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2a3a6d07-7d02-4d47-836e-75a930433d87-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-89cwl\" (UID: \"2a3a6d07-7d02-4d47-836e-75a930433d87\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.779710 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d55038e1-9978-48f8-b430-78c7da1ca5e5-node-pullsecrets\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.779973 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.780108 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54b9e242-469b-450e-b2a1-741d9ee601a1-config\") pod \"console-operator-58897d9998-bgqpz\" (UID: \"54b9e242-469b-450e-b2a1-741d9ee601a1\") " pod="openshift-console-operator/console-operator-58897d9998-bgqpz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.780232 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d55038e1-9978-48f8-b430-78c7da1ca5e5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.780321 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-client-ca\") pod \"route-controller-manager-6576b87f9c-dqxgd\" (UID: \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.780043 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be0d6859-aa4c-4a58-97ea-3f3657d4773f-audit-dir\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: 
\"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.780329 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/72557843-c8c0-4228-89fc-e735935e10a3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xrjhk\" (UID: \"72557843-c8c0-4228-89fc-e735935e10a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.780401 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.780460 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d55038e1-9978-48f8-b430-78c7da1ca5e5-config\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.780529 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.780676 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83cafff0-05df-46bf-a875-f03ad748a9fc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-27jm2\" (UID: \"83cafff0-05df-46bf-a875-f03ad748a9fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-27jm2" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.781295 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.781983 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.782908 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d55038e1-9978-48f8-b430-78c7da1ca5e5-audit-dir\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.783422 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72557843-c8c0-4228-89fc-e735935e10a3-serving-cert\") pod \"openshift-config-operator-7777fb866f-xrjhk\" (UID: \"72557843-c8c0-4228-89fc-e735935e10a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.783442 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a3a6d07-7d02-4d47-836e-75a930433d87-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-89cwl\" (UID: \"2a3a6d07-7d02-4d47-836e-75a930433d87\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.783672 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-serving-cert\") pod \"route-controller-manager-6576b87f9c-dqxgd\" (UID: \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.783695 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d55038e1-9978-48f8-b430-78c7da1ca5e5-etcd-serving-ca\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.783816 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d55038e1-9978-48f8-b430-78c7da1ca5e5-serving-cert\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.784143 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be0d6859-aa4c-4a58-97ea-3f3657d4773f-serving-cert\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.784645 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-2nrns"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 
14:57:10.784719 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83cafff0-05df-46bf-a875-f03ad748a9fc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-27jm2\" (UID: \"83cafff0-05df-46bf-a875-f03ad748a9fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-27jm2" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.784921 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d55038e1-9978-48f8-b430-78c7da1ca5e5-encryption-config\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.785067 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42c1b38d-39c7-4bfd-ae8e-c8ef6651565f-serving-cert\") pod \"authentication-operator-69f744f599-ncw52\" (UID: \"42c1b38d-39c7-4bfd-ae8e-c8ef6651565f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.785510 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2nrns" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.785744 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-t98bs"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.786179 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/be0d6859-aa4c-4a58-97ea-3f3657d4773f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.786575 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42c55268-c122-4eb0-9508-b5507897990b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s88th\" (UID: \"42c55268-c122-4eb0-9508-b5507897990b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s88th" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.786950 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.787694 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ms9lk"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.788753 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42c55268-c122-4eb0-9508-b5507897990b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s88th\" (UID: \"42c55268-c122-4eb0-9508-b5507897990b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s88th" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.789121 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54b9e242-469b-450e-b2a1-741d9ee601a1-serving-cert\") pod \"console-operator-58897d9998-bgqpz\" (UID: \"54b9e242-469b-450e-b2a1-741d9ee601a1\") " pod="openshift-console-operator/console-operator-58897d9998-bgqpz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.789233 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.789350 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be0d6859-aa4c-4a58-97ea-3f3657d4773f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.789546 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42c1b38d-39c7-4bfd-ae8e-c8ef6651565f-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-ncw52\" (UID: \"42c1b38d-39c7-4bfd-ae8e-c8ef6651565f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.790097 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-twhnr\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.790087 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e1a2834e-159c-47f0-81a8-87d37d89a22a-images\") pod \"machine-api-operator-5694c8668f-5fjmz\" (UID: \"e1a2834e-159c-47f0-81a8-87d37d89a22a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.790391 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54b9e242-469b-450e-b2a1-741d9ee601a1-trusted-ca\") pod \"console-operator-58897d9998-bgqpz\" (UID: \"54b9e242-469b-450e-b2a1-741d9ee601a1\") " pod="openshift-console-operator/console-operator-58897d9998-bgqpz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.790431 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nxkgw"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.791630 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ql6wt"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.793164 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5fjmz"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.793245 
4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d55038e1-9978-48f8-b430-78c7da1ca5e5-etcd-client\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.793494 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1a2834e-159c-47f0-81a8-87d37d89a22a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5fjmz\" (UID: \"e1a2834e-159c-47f0-81a8-87d37d89a22a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.793505 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/be0d6859-aa4c-4a58-97ea-3f3657d4773f-etcd-client\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.793792 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/be0d6859-aa4c-4a58-97ea-3f3657d4773f-encryption-config\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.794144 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68f674b8-b7c3-43e8-b132-7d6b881cbd31-serving-cert\") pod \"controller-manager-879f6c89f-twhnr\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.794415 4775 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v798k"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.795536 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-fc2jr"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.796481 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.797693 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2tkrk"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.799558 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bgqpz"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.799618 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b5vwq"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.800429 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-d9jzf"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.801387 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9gmd5"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.802322 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2r426"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.803267 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rpxqq"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.804184 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.805118 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.806222 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5bmtw"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.809521 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.810098 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.811975 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sxl49"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.815820 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7v5ck"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.818113 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7v5ck" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.818734 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6kp7p"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.819687 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6kp7p" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.820750 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-58rfh"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.822316 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ncw52"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.824112 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zmgzc"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.825571 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rfplx"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.827391 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.829235 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.829645 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.830438 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t668z"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.832085 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.833142 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-54js6"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.834178 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7v5ck"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.835260 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.836518 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6kp7p"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.837606 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-t98bs"] Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.849839 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.870394 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.878595 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.878642 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35f10989-95d7-4557-9ba3-b2e966a63938-config\") pod \"kube-apiserver-operator-766d6c64bb-rpxqq\" (UID: \"35f10989-95d7-4557-9ba3-b2e966a63938\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rpxqq" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.878668 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e62a9230-6690-4db9-a808-60b544283182-cert\") pod \"ingress-canary-7v5ck\" (UID: \"e62a9230-6690-4db9-a808-60b544283182\") " pod="openshift-ingress-canary/ingress-canary-7v5ck" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.878697 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56428379-949d-4ba8-9b32-8ee7432abba7-config\") pod \"machine-approver-56656f9798-xklhw\" (UID: \"56428379-949d-4ba8-9b32-8ee7432abba7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.878721 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49286131-fb39-4261-bc4f-db68474c8fa0-srv-cert\") pod \"olm-operator-6b444d44fb-74649\" (UID: \"49286131-fb39-4261-bc4f-db68474c8fa0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.878742 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/56428379-949d-4ba8-9b32-8ee7432abba7-machine-approver-tls\") pod \"machine-approver-56656f9798-xklhw\" (UID: \"56428379-949d-4ba8-9b32-8ee7432abba7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.878762 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d-certs\") pod 
\"machine-config-server-2nrns\" (UID: \"fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d\") " pod="openshift-machine-config-operator/machine-config-server-2nrns" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.878821 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.878854 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwl9g\" (UniqueName: \"kubernetes.io/projected/2b5f39f2-f4e2-4306-b64c-669ca82f8869-kube-api-access-lwl9g\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.878882 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbvfx\" (UniqueName: \"kubernetes.io/projected/49286131-fb39-4261-bc4f-db68474c8fa0-kube-api-access-wbvfx\") pod \"olm-operator-6b444d44fb-74649\" (UID: \"49286131-fb39-4261-bc4f-db68474c8fa0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.878937 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-console-config\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.878966 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d-node-bootstrap-token\") pod \"machine-config-server-2nrns\" (UID: \"fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d\") " pod="openshift-machine-config-operator/machine-config-server-2nrns" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.879001 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e45773ce-d026-4240-bde4-17339b57ef93-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-54js6\" (UID: \"e45773ce-d026-4240-bde4-17339b57ef93\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-54js6" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.879032 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c21af7b0-6f27-43de-8c44-6e6519262019-console-serving-cert\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.879052 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4e5a6b47-360c-4b64-9ba3-15edeb2006fa-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-v798k\" (UID: \"4e5a6b47-360c-4b64-9ba3-15edeb2006fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v798k" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.879072 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df1b126-33ca-4ad8-911b-44c4f1457b99-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-c58rz\" (UID: \"2df1b126-33ca-4ad8-911b-44c4f1457b99\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.879091 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j64p5\" (UniqueName: \"kubernetes.io/projected/fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d-kube-api-access-j64p5\") pod \"machine-config-server-2nrns\" (UID: \"fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d\") " pod="openshift-machine-config-operator/machine-config-server-2nrns" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.879111 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24ba18d0-a989-4a98-99d3-bbdb42ce4bb9-config\") pod \"service-ca-operator-777779d784-9gmd5\" (UID: \"24ba18d0-a989-4a98-99d3-bbdb42ce4bb9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9gmd5" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.879135 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/40f1d8c0-d195-457c-909e-10fd294a0bfc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-d9jzf\" (UID: \"40f1d8c0-d195-457c-909e-10fd294a0bfc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-d9jzf" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.879166 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df95f79a-fd5f-4a8d-a062-03ce319dc30b-srv-cert\") pod \"catalog-operator-68c6474976-lj8mv\" (UID: \"df95f79a-fd5f-4a8d-a062-03ce319dc30b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.879190 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b6576b0-b9f9-4599-8876-c9b7b0a60a43-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b5vwq\" (UID: \"4b6576b0-b9f9-4599-8876-c9b7b0a60a43\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b5vwq" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.879281 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56428379-949d-4ba8-9b32-8ee7432abba7-config\") pod \"machine-approver-56656f9798-xklhw\" (UID: \"56428379-949d-4ba8-9b32-8ee7432abba7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.879665 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.879697 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-service-ca\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.879720 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.879743 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c21af7b0-6f27-43de-8c44-6e6519262019-console-oauth-config\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.879847 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.879952 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ed9516f-e373-480b-a645-ad35bec98fa4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pw4wv\" (UID: \"9ed9516f-e373-480b-a645-ad35bec98fa4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880014 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a492f5f7-b613-4e56-8071-78f8c836e7c3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2tkrk\" (UID: \"a492f5f7-b613-4e56-8071-78f8c836e7c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2tkrk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880041 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdrcf\" (UniqueName: \"kubernetes.io/projected/9ed9516f-e373-480b-a645-ad35bec98fa4-kube-api-access-zdrcf\") pod \"machine-config-controller-84d6567774-pw4wv\" (UID: \"9ed9516f-e373-480b-a645-ad35bec98fa4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880080 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/11dbd98a-eb9e-4d5f-b52d-df105cbeb83c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zmgzc\" (UID: \"11dbd98a-eb9e-4d5f-b52d-df105cbeb83c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zmgzc" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880105 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880137 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880166 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/49286131-fb39-4261-bc4f-db68474c8fa0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-74649\" (UID: \"49286131-fb39-4261-bc4f-db68474c8fa0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880188 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-audit-policies\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880207 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-trusted-ca-bundle\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880248 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp98r\" (UniqueName: \"kubernetes.io/projected/40f1d8c0-d195-457c-909e-10fd294a0bfc-kube-api-access-tp98r\") pod \"multus-admission-controller-857f4d67dd-d9jzf\" (UID: \"40f1d8c0-d195-457c-909e-10fd294a0bfc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-d9jzf" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880272 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2df1b126-33ca-4ad8-911b-44c4f1457b99-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-c58rz\" (UID: \"2df1b126-33ca-4ad8-911b-44c4f1457b99\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880320 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwtqh\" (UniqueName: \"kubernetes.io/projected/4e5a6b47-360c-4b64-9ba3-15edeb2006fa-kube-api-access-hwtqh\") pod \"cluster-samples-operator-665b6dd947-v798k\" (UID: \"4e5a6b47-360c-4b64-9ba3-15edeb2006fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v798k" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880365 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgqkw\" (UniqueName: \"kubernetes.io/projected/b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a-kube-api-access-kgqkw\") pod \"service-ca-9c57cc56f-rfplx\" (UID: \"b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rfplx" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880385 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-oauth-serving-cert\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880403 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsgxd\" (UniqueName: \"kubernetes.io/projected/c21af7b0-6f27-43de-8c44-6e6519262019-kube-api-access-tsgxd\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880424 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880446 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24ba18d0-a989-4a98-99d3-bbdb42ce4bb9-serving-cert\") pod \"service-ca-operator-777779d784-9gmd5\" (UID: \"24ba18d0-a989-4a98-99d3-bbdb42ce4bb9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9gmd5" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880474 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e45773ce-d026-4240-bde4-17339b57ef93-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-54js6\" (UID: \"e45773ce-d026-4240-bde4-17339b57ef93\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-54js6" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880499 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a-signing-key\") pod \"service-ca-9c57cc56f-rfplx\" (UID: \"b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rfplx" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880541 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qb6v\" (UniqueName: \"kubernetes.io/projected/e62a9230-6690-4db9-a808-60b544283182-kube-api-access-9qb6v\") pod \"ingress-canary-7v5ck\" (UID: \"e62a9230-6690-4db9-a808-60b544283182\") " pod="openshift-ingress-canary/ingress-canary-7v5ck" Dec 16 
14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880577 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35f10989-95d7-4557-9ba3-b2e966a63938-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rpxqq\" (UID: \"35f10989-95d7-4557-9ba3-b2e966a63938\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rpxqq" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880618 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxqzh\" (UniqueName: \"kubernetes.io/projected/4b6576b0-b9f9-4599-8876-c9b7b0a60a43-kube-api-access-mxqzh\") pod \"package-server-manager-789f6589d5-b5vwq\" (UID: \"4b6576b0-b9f9-4599-8876-c9b7b0a60a43\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b5vwq" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880645 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880669 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880689 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/b36ff831-d91c-4350-a36b-bd0625ffb661-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-58rfh\" (UID: \"b36ff831-d91c-4350-a36b-bd0625ffb661\") " pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880708 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcfwq\" (UniqueName: \"kubernetes.io/projected/56428379-949d-4ba8-9b32-8ee7432abba7-kube-api-access-wcfwq\") pod \"machine-approver-56656f9798-xklhw\" (UID: \"56428379-949d-4ba8-9b32-8ee7432abba7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880726 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880753 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/56428379-949d-4ba8-9b32-8ee7432abba7-auth-proxy-config\") pod \"machine-approver-56656f9798-xklhw\" (UID: \"56428379-949d-4ba8-9b32-8ee7432abba7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880771 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdqh5\" (UniqueName: \"kubernetes.io/projected/24ba18d0-a989-4a98-99d3-bbdb42ce4bb9-kube-api-access-vdqh5\") pod \"service-ca-operator-777779d784-9gmd5\" (UID: \"24ba18d0-a989-4a98-99d3-bbdb42ce4bb9\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-9gmd5" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880787 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11dbd98a-eb9e-4d5f-b52d-df105cbeb83c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zmgzc\" (UID: \"11dbd98a-eb9e-4d5f-b52d-df105cbeb83c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zmgzc" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880804 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j7r9\" (UniqueName: \"kubernetes.io/projected/0eaef920-ee71-48f6-b577-02528a4ec363-kube-api-access-5j7r9\") pod \"migrator-59844c95c7-ql6wt\" (UID: \"0eaef920-ee71-48f6-b577-02528a4ec363\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ql6wt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880820 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a-signing-cabundle\") pod \"service-ca-9c57cc56f-rfplx\" (UID: \"b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rfplx" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880835 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwcwm\" (UniqueName: \"kubernetes.io/projected/2df1b126-33ca-4ad8-911b-44c4f1457b99-kube-api-access-rwcwm\") pod \"kube-storage-version-migrator-operator-b67b599dd-c58rz\" (UID: \"2df1b126-33ca-4ad8-911b-44c4f1457b99\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880866 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlsfq\" (UniqueName: \"kubernetes.io/projected/df95f79a-fd5f-4a8d-a062-03ce319dc30b-kube-api-access-qlsfq\") pod \"catalog-operator-68c6474976-lj8mv\" (UID: \"df95f79a-fd5f-4a8d-a062-03ce319dc30b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880883 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b36ff831-d91c-4350-a36b-bd0625ffb661-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-58rfh\" (UID: \"b36ff831-d91c-4350-a36b-bd0625ffb661\") " pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880924 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxr94\" (UniqueName: \"kubernetes.io/projected/edb75c2a-9e6f-4a80-aadd-38416ba9c9a4-kube-api-access-dxr94\") pod \"downloads-7954f5f757-5bmtw\" (UID: \"edb75c2a-9e6f-4a80-aadd-38416ba9c9a4\") " pod="openshift-console/downloads-7954f5f757-5bmtw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880942 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11dbd98a-eb9e-4d5f-b52d-df105cbeb83c-config\") pod \"kube-controller-manager-operator-78b949d7b-zmgzc\" (UID: \"11dbd98a-eb9e-4d5f-b52d-df105cbeb83c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zmgzc" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880976 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b5f39f2-f4e2-4306-b64c-669ca82f8869-audit-dir\") pod 
\"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.880994 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65bf6\" (UniqueName: \"kubernetes.io/projected/a492f5f7-b613-4e56-8071-78f8c836e7c3-kube-api-access-65bf6\") pod \"control-plane-machine-set-operator-78cbb6b69f-2tkrk\" (UID: \"a492f5f7-b613-4e56-8071-78f8c836e7c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2tkrk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.881019 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/df95f79a-fd5f-4a8d-a062-03ce319dc30b-profile-collector-cert\") pod \"catalog-operator-68c6474976-lj8mv\" (UID: \"df95f79a-fd5f-4a8d-a062-03ce319dc30b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.881050 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35f10989-95d7-4557-9ba3-b2e966a63938-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rpxqq\" (UID: \"35f10989-95d7-4557-9ba3-b2e966a63938\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rpxqq" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.881104 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss5ht\" (UniqueName: \"kubernetes.io/projected/b36ff831-d91c-4350-a36b-bd0625ffb661-kube-api-access-ss5ht\") pod \"marketplace-operator-79b997595-58rfh\" (UID: \"b36ff831-d91c-4350-a36b-bd0625ffb661\") " pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" Dec 16 
14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.881137 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ed9516f-e373-480b-a645-ad35bec98fa4-proxy-tls\") pod \"machine-config-controller-84d6567774-pw4wv\" (UID: \"9ed9516f-e373-480b-a645-ad35bec98fa4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.881155 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e45773ce-d026-4240-bde4-17339b57ef93-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-54js6\" (UID: \"e45773ce-d026-4240-bde4-17339b57ef93\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-54js6" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.881295 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-console-config\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.881797 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.881824 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-session\") 
pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.882301 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.882354 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/56428379-949d-4ba8-9b32-8ee7432abba7-machine-approver-tls\") pod \"machine-approver-56656f9798-xklhw\" (UID: \"56428379-949d-4ba8-9b32-8ee7432abba7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.882640 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b5f39f2-f4e2-4306-b64c-669ca82f8869-audit-dir\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.882752 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.882643 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.883249 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4e5a6b47-360c-4b64-9ba3-15edeb2006fa-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-v798k\" (UID: \"4e5a6b47-360c-4b64-9ba3-15edeb2006fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v798k" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.883536 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/56428379-949d-4ba8-9b32-8ee7432abba7-auth-proxy-config\") pod \"machine-approver-56656f9798-xklhw\" (UID: \"56428379-949d-4ba8-9b32-8ee7432abba7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.883770 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-service-ca\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.883864 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc 
kubenswrapper[4775]: I1216 14:57:10.884143 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-oauth-serving-cert\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.884142 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-audit-policies\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.884675 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-trusted-ca-bundle\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.884753 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c21af7b0-6f27-43de-8c44-6e6519262019-console-oauth-config\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.885584 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc 
kubenswrapper[4775]: I1216 14:57:10.885647 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.885796 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.886127 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.887130 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.887174 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c21af7b0-6f27-43de-8c44-6e6519262019-console-serving-cert\") 
pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.889955 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.910052 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.929664 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.949492 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.970083 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982076 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qb6v\" (UniqueName: \"kubernetes.io/projected/e62a9230-6690-4db9-a808-60b544283182-kube-api-access-9qb6v\") pod \"ingress-canary-7v5ck\" (UID: \"e62a9230-6690-4db9-a808-60b544283182\") " pod="openshift-ingress-canary/ingress-canary-7v5ck" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982150 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35f10989-95d7-4557-9ba3-b2e966a63938-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rpxqq\" (UID: \"35f10989-95d7-4557-9ba3-b2e966a63938\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rpxqq" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982193 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxqzh\" (UniqueName: \"kubernetes.io/projected/4b6576b0-b9f9-4599-8876-c9b7b0a60a43-kube-api-access-mxqzh\") pod \"package-server-manager-789f6589d5-b5vwq\" (UID: \"4b6576b0-b9f9-4599-8876-c9b7b0a60a43\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b5vwq" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982228 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b36ff831-d91c-4350-a36b-bd0625ffb661-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-58rfh\" (UID: \"b36ff831-d91c-4350-a36b-bd0625ffb661\") " pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982279 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdqh5\" (UniqueName: \"kubernetes.io/projected/24ba18d0-a989-4a98-99d3-bbdb42ce4bb9-kube-api-access-vdqh5\") pod \"service-ca-operator-777779d784-9gmd5\" (UID: \"24ba18d0-a989-4a98-99d3-bbdb42ce4bb9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9gmd5" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982313 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11dbd98a-eb9e-4d5f-b52d-df105cbeb83c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zmgzc\" (UID: \"11dbd98a-eb9e-4d5f-b52d-df105cbeb83c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zmgzc" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982346 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-5j7r9\" (UniqueName: \"kubernetes.io/projected/0eaef920-ee71-48f6-b577-02528a4ec363-kube-api-access-5j7r9\") pod \"migrator-59844c95c7-ql6wt\" (UID: \"0eaef920-ee71-48f6-b577-02528a4ec363\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ql6wt" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982381 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwcwm\" (UniqueName: \"kubernetes.io/projected/2df1b126-33ca-4ad8-911b-44c4f1457b99-kube-api-access-rwcwm\") pod \"kube-storage-version-migrator-operator-b67b599dd-c58rz\" (UID: \"2df1b126-33ca-4ad8-911b-44c4f1457b99\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982417 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlsfq\" (UniqueName: \"kubernetes.io/projected/df95f79a-fd5f-4a8d-a062-03ce319dc30b-kube-api-access-qlsfq\") pod \"catalog-operator-68c6474976-lj8mv\" (UID: \"df95f79a-fd5f-4a8d-a062-03ce319dc30b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982447 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a-signing-cabundle\") pod \"service-ca-9c57cc56f-rfplx\" (UID: \"b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rfplx" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982480 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b36ff831-d91c-4350-a36b-bd0625ffb661-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-58rfh\" (UID: \"b36ff831-d91c-4350-a36b-bd0625ffb661\") 
" pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982535 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65bf6\" (UniqueName: \"kubernetes.io/projected/a492f5f7-b613-4e56-8071-78f8c836e7c3-kube-api-access-65bf6\") pod \"control-plane-machine-set-operator-78cbb6b69f-2tkrk\" (UID: \"a492f5f7-b613-4e56-8071-78f8c836e7c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2tkrk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982567 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11dbd98a-eb9e-4d5f-b52d-df105cbeb83c-config\") pod \"kube-controller-manager-operator-78b949d7b-zmgzc\" (UID: \"11dbd98a-eb9e-4d5f-b52d-df105cbeb83c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zmgzc" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982601 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/df95f79a-fd5f-4a8d-a062-03ce319dc30b-profile-collector-cert\") pod \"catalog-operator-68c6474976-lj8mv\" (UID: \"df95f79a-fd5f-4a8d-a062-03ce319dc30b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982632 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35f10989-95d7-4557-9ba3-b2e966a63938-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rpxqq\" (UID: \"35f10989-95d7-4557-9ba3-b2e966a63938\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rpxqq" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982693 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ss5ht\" (UniqueName: \"kubernetes.io/projected/b36ff831-d91c-4350-a36b-bd0625ffb661-kube-api-access-ss5ht\") pod \"marketplace-operator-79b997595-58rfh\" (UID: \"b36ff831-d91c-4350-a36b-bd0625ffb661\") " pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982737 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ed9516f-e373-480b-a645-ad35bec98fa4-proxy-tls\") pod \"machine-config-controller-84d6567774-pw4wv\" (UID: \"9ed9516f-e373-480b-a645-ad35bec98fa4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982770 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e45773ce-d026-4240-bde4-17339b57ef93-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-54js6\" (UID: \"e45773ce-d026-4240-bde4-17339b57ef93\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-54js6" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982805 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35f10989-95d7-4557-9ba3-b2e966a63938-config\") pod \"kube-apiserver-operator-766d6c64bb-rpxqq\" (UID: \"35f10989-95d7-4557-9ba3-b2e966a63938\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rpxqq" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982837 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e62a9230-6690-4db9-a808-60b544283182-cert\") pod \"ingress-canary-7v5ck\" (UID: \"e62a9230-6690-4db9-a808-60b544283182\") " pod="openshift-ingress-canary/ingress-canary-7v5ck" Dec 16 14:57:10 crc 
kubenswrapper[4775]: I1216 14:57:10.982873 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49286131-fb39-4261-bc4f-db68474c8fa0-srv-cert\") pod \"olm-operator-6b444d44fb-74649\" (UID: \"49286131-fb39-4261-bc4f-db68474c8fa0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982942 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d-certs\") pod \"machine-config-server-2nrns\" (UID: \"fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d\") " pod="openshift-machine-config-operator/machine-config-server-2nrns" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.982991 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d-node-bootstrap-token\") pod \"machine-config-server-2nrns\" (UID: \"fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d\") " pod="openshift-machine-config-operator/machine-config-server-2nrns" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983024 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbvfx\" (UniqueName: \"kubernetes.io/projected/49286131-fb39-4261-bc4f-db68474c8fa0-kube-api-access-wbvfx\") pod \"olm-operator-6b444d44fb-74649\" (UID: \"49286131-fb39-4261-bc4f-db68474c8fa0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983056 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e45773ce-d026-4240-bde4-17339b57ef93-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-54js6\" (UID: \"e45773ce-d026-4240-bde4-17339b57ef93\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-54js6" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983102 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df1b126-33ca-4ad8-911b-44c4f1457b99-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-c58rz\" (UID: \"2df1b126-33ca-4ad8-911b-44c4f1457b99\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983135 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j64p5\" (UniqueName: \"kubernetes.io/projected/fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d-kube-api-access-j64p5\") pod \"machine-config-server-2nrns\" (UID: \"fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d\") " pod="openshift-machine-config-operator/machine-config-server-2nrns" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983165 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24ba18d0-a989-4a98-99d3-bbdb42ce4bb9-config\") pod \"service-ca-operator-777779d784-9gmd5\" (UID: \"24ba18d0-a989-4a98-99d3-bbdb42ce4bb9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9gmd5" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983194 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/40f1d8c0-d195-457c-909e-10fd294a0bfc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-d9jzf\" (UID: \"40f1d8c0-d195-457c-909e-10fd294a0bfc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-d9jzf" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983234 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/df95f79a-fd5f-4a8d-a062-03ce319dc30b-srv-cert\") pod \"catalog-operator-68c6474976-lj8mv\" (UID: \"df95f79a-fd5f-4a8d-a062-03ce319dc30b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983267 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b6576b0-b9f9-4599-8876-c9b7b0a60a43-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b5vwq\" (UID: \"4b6576b0-b9f9-4599-8876-c9b7b0a60a43\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b5vwq" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983384 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ed9516f-e373-480b-a645-ad35bec98fa4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pw4wv\" (UID: \"9ed9516f-e373-480b-a645-ad35bec98fa4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983432 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a492f5f7-b613-4e56-8071-78f8c836e7c3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2tkrk\" (UID: \"a492f5f7-b613-4e56-8071-78f8c836e7c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2tkrk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983465 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdrcf\" (UniqueName: \"kubernetes.io/projected/9ed9516f-e373-480b-a645-ad35bec98fa4-kube-api-access-zdrcf\") pod \"machine-config-controller-84d6567774-pw4wv\" (UID: 
\"9ed9516f-e373-480b-a645-ad35bec98fa4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983507 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/11dbd98a-eb9e-4d5f-b52d-df105cbeb83c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zmgzc\" (UID: \"11dbd98a-eb9e-4d5f-b52d-df105cbeb83c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zmgzc" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983562 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49286131-fb39-4261-bc4f-db68474c8fa0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-74649\" (UID: \"49286131-fb39-4261-bc4f-db68474c8fa0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983614 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp98r\" (UniqueName: \"kubernetes.io/projected/40f1d8c0-d195-457c-909e-10fd294a0bfc-kube-api-access-tp98r\") pod \"multus-admission-controller-857f4d67dd-d9jzf\" (UID: \"40f1d8c0-d195-457c-909e-10fd294a0bfc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-d9jzf" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983645 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2df1b126-33ca-4ad8-911b-44c4f1457b99-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-c58rz\" (UID: \"2df1b126-33ca-4ad8-911b-44c4f1457b99\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983710 
4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgqkw\" (UniqueName: \"kubernetes.io/projected/b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a-kube-api-access-kgqkw\") pod \"service-ca-9c57cc56f-rfplx\" (UID: \"b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rfplx" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983763 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24ba18d0-a989-4a98-99d3-bbdb42ce4bb9-serving-cert\") pod \"service-ca-operator-777779d784-9gmd5\" (UID: \"24ba18d0-a989-4a98-99d3-bbdb42ce4bb9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9gmd5" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983793 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e45773ce-d026-4240-bde4-17339b57ef93-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-54js6\" (UID: \"e45773ce-d026-4240-bde4-17339b57ef93\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-54js6" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.983822 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a-signing-key\") pod \"service-ca-9c57cc56f-rfplx\" (UID: \"b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rfplx" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.985102 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35f10989-95d7-4557-9ba3-b2e966a63938-config\") pod \"kube-apiserver-operator-766d6c64bb-rpxqq\" (UID: \"35f10989-95d7-4557-9ba3-b2e966a63938\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rpxqq" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.985327 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11dbd98a-eb9e-4d5f-b52d-df105cbeb83c-config\") pod \"kube-controller-manager-operator-78b949d7b-zmgzc\" (UID: \"11dbd98a-eb9e-4d5f-b52d-df105cbeb83c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zmgzc" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.986432 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ed9516f-e373-480b-a645-ad35bec98fa4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pw4wv\" (UID: \"9ed9516f-e373-480b-a645-ad35bec98fa4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.986473 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11dbd98a-eb9e-4d5f-b52d-df105cbeb83c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zmgzc\" (UID: \"11dbd98a-eb9e-4d5f-b52d-df105cbeb83c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zmgzc" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.987478 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35f10989-95d7-4557-9ba3-b2e966a63938-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rpxqq\" (UID: \"35f10989-95d7-4557-9ba3-b2e966a63938\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rpxqq" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.988794 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a492f5f7-b613-4e56-8071-78f8c836e7c3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2tkrk\" (UID: \"a492f5f7-b613-4e56-8071-78f8c836e7c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2tkrk" Dec 16 14:57:10 crc kubenswrapper[4775]: I1216 14:57:10.989615 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.009644 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.029958 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.037225 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a-signing-key\") pod \"service-ca-9c57cc56f-rfplx\" (UID: \"b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rfplx" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.049338 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.055004 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a-signing-cabundle\") pod \"service-ca-9c57cc56f-rfplx\" (UID: \"b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rfplx" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.069480 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"kube-root-ca.crt" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.089722 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.116038 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.130201 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.149087 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.169565 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.193142 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.198617 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49286131-fb39-4261-bc4f-db68474c8fa0-srv-cert\") pod \"olm-operator-6b444d44fb-74649\" (UID: \"49286131-fb39-4261-bc4f-db68474c8fa0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.209411 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.218849 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49286131-fb39-4261-bc4f-db68474c8fa0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-74649\" (UID: \"49286131-fb39-4261-bc4f-db68474c8fa0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.219445 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/df95f79a-fd5f-4a8d-a062-03ce319dc30b-profile-collector-cert\") pod \"catalog-operator-68c6474976-lj8mv\" (UID: \"df95f79a-fd5f-4a8d-a062-03ce319dc30b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.229708 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.248996 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.257200 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/40f1d8c0-d195-457c-909e-10fd294a0bfc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-d9jzf\" (UID: \"40f1d8c0-d195-457c-909e-10fd294a0bfc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-d9jzf" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.269624 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.289560 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.297820 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b6576b0-b9f9-4599-8876-c9b7b0a60a43-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b5vwq\" (UID: \"4b6576b0-b9f9-4599-8876-c9b7b0a60a43\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b5vwq" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.310480 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.330267 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.350276 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.370448 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.390135 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.411053 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.429761 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.451016 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.470392 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 16 
14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.480437 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ed9516f-e373-480b-a645-ad35bec98fa4-proxy-tls\") pod \"machine-config-controller-84d6567774-pw4wv\" (UID: \"9ed9516f-e373-480b-a645-ad35bec98fa4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.491411 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.517696 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.524871 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b36ff831-d91c-4350-a36b-bd0625ffb661-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-58rfh\" (UID: \"b36ff831-d91c-4350-a36b-bd0625ffb661\") " pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.530530 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.550546 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.560081 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b36ff831-d91c-4350-a36b-bd0625ffb661-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-58rfh\" (UID: \"b36ff831-d91c-4350-a36b-bd0625ffb661\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.569871 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.589762 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.610555 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.631148 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.650607 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.670503 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.678384 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e45773ce-d026-4240-bde4-17339b57ef93-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-54js6\" (UID: \"e45773ce-d026-4240-bde4-17339b57ef93\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-54js6" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.690117 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.709263 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.715656 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e45773ce-d026-4240-bde4-17339b57ef93-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-54js6\" (UID: \"e45773ce-d026-4240-bde4-17339b57ef93\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-54js6" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.729329 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.749441 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.758252 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24ba18d0-a989-4a98-99d3-bbdb42ce4bb9-serving-cert\") pod \"service-ca-operator-777779d784-9gmd5\" (UID: \"24ba18d0-a989-4a98-99d3-bbdb42ce4bb9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9gmd5" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.770295 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.787865 4775 request.go:700] Waited for 1.016604902s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.790004 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.810806 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.814994 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24ba18d0-a989-4a98-99d3-bbdb42ce4bb9-config\") pod \"service-ca-operator-777779d784-9gmd5\" (UID: \"24ba18d0-a989-4a98-99d3-bbdb42ce4bb9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9gmd5" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.830767 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.849407 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.871306 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.890920 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.910570 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.930034 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.950220 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 16 
14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.969738 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 16 14:57:11 crc kubenswrapper[4775]: E1216 14:57:11.984383 4775 configmap.go:193] Couldn't get configMap openshift-kube-storage-version-migrator-operator/config: failed to sync configmap cache: timed out waiting for the condition Dec 16 14:57:11 crc kubenswrapper[4775]: E1216 14:57:11.984424 4775 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Dec 16 14:57:11 crc kubenswrapper[4775]: E1216 14:57:11.984469 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2df1b126-33ca-4ad8-911b-44c4f1457b99-config podName:2df1b126-33ca-4ad8-911b-44c4f1457b99 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:12.484445612 +0000 UTC m=+157.435524545 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/2df1b126-33ca-4ad8-911b-44c4f1457b99-config") pod "kube-storage-version-migrator-operator-b67b599dd-c58rz" (UID: "2df1b126-33ca-4ad8-911b-44c4f1457b99") : failed to sync configmap cache: timed out waiting for the condition Dec 16 14:57:11 crc kubenswrapper[4775]: E1216 14:57:11.984504 4775 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Dec 16 14:57:11 crc kubenswrapper[4775]: E1216 14:57:11.984534 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d-certs podName:fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d nodeName:}" failed. No retries permitted until 2025-12-16 14:57:12.484505264 +0000 UTC m=+157.435584217 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d-certs") pod "machine-config-server-2nrns" (UID: "fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d") : failed to sync secret cache: timed out waiting for the condition Dec 16 14:57:11 crc kubenswrapper[4775]: E1216 14:57:11.984566 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d-node-bootstrap-token podName:fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d nodeName:}" failed. No retries permitted until 2025-12-16 14:57:12.484552116 +0000 UTC m=+157.435631079 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d-node-bootstrap-token") pod "machine-config-server-2nrns" (UID: "fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d") : failed to sync secret cache: timed out waiting for the condition Dec 16 14:57:11 crc kubenswrapper[4775]: E1216 14:57:11.984597 4775 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 16 14:57:11 crc kubenswrapper[4775]: E1216 14:57:11.984635 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df95f79a-fd5f-4a8d-a062-03ce319dc30b-srv-cert podName:df95f79a-fd5f-4a8d-a062-03ce319dc30b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:12.484625498 +0000 UTC m=+157.435704461 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/df95f79a-fd5f-4a8d-a062-03ce319dc30b-srv-cert") pod "catalog-operator-68c6474976-lj8mv" (UID: "df95f79a-fd5f-4a8d-a062-03ce319dc30b") : failed to sync secret cache: timed out waiting for the condition Dec 16 14:57:11 crc kubenswrapper[4775]: E1216 14:57:11.984653 4775 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 16 14:57:11 crc kubenswrapper[4775]: E1216 14:57:11.984686 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e62a9230-6690-4db9-a808-60b544283182-cert podName:e62a9230-6690-4db9-a808-60b544283182 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:12.4846772 +0000 UTC m=+157.435756133 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e62a9230-6690-4db9-a808-60b544283182-cert") pod "ingress-canary-7v5ck" (UID: "e62a9230-6690-4db9-a808-60b544283182") : failed to sync secret cache: timed out waiting for the condition Dec 16 14:57:11 crc kubenswrapper[4775]: E1216 14:57:11.985054 4775 secret.go:188] Couldn't get secret openshift-kube-storage-version-migrator-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 16 14:57:11 crc kubenswrapper[4775]: E1216 14:57:11.985124 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2df1b126-33ca-4ad8-911b-44c4f1457b99-serving-cert podName:2df1b126-33ca-4ad8-911b-44c4f1457b99 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:12.485110153 +0000 UTC m=+157.436189096 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2df1b126-33ca-4ad8-911b-44c4f1457b99-serving-cert") pod "kube-storage-version-migrator-operator-b67b599dd-c58rz" (UID: "2df1b126-33ca-4ad8-911b-44c4f1457b99") : failed to sync secret cache: timed out waiting for the condition Dec 16 14:57:11 crc kubenswrapper[4775]: I1216 14:57:11.990731 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.009809 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.030732 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.050057 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.071695 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.091412 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.110519 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.156020 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zq42\" (UniqueName: \"kubernetes.io/projected/68f674b8-b7c3-43e8-b132-7d6b881cbd31-kube-api-access-9zq42\") pod \"controller-manager-879f6c89f-twhnr\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.164179 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a3a6d07-7d02-4d47-836e-75a930433d87-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-89cwl\" (UID: \"2a3a6d07-7d02-4d47-836e-75a930433d87\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.185571 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxzrv\" (UniqueName: \"kubernetes.io/projected/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-kube-api-access-vxzrv\") pod \"route-controller-manager-6576b87f9c-dqxgd\" (UID: \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.202015 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.211870 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tq2g\" (UniqueName: \"kubernetes.io/projected/54b9e242-469b-450e-b2a1-741d9ee601a1-kube-api-access-2tq2g\") pod \"console-operator-58897d9998-bgqpz\" (UID: \"54b9e242-469b-450e-b2a1-741d9ee601a1\") " pod="openshift-console-operator/console-operator-58897d9998-bgqpz" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.225875 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-974sg\" (UniqueName: \"kubernetes.io/projected/d55038e1-9978-48f8-b430-78c7da1ca5e5-kube-api-access-974sg\") pod \"apiserver-76f77b778f-nxkgw\" (UID: \"d55038e1-9978-48f8-b430-78c7da1ca5e5\") " pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.229921 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.241047 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.263350 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bgqpz" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.270756 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq72f\" (UniqueName: \"kubernetes.io/projected/2a3a6d07-7d02-4d47-836e-75a930433d87-kube-api-access-kq72f\") pod \"cluster-image-registry-operator-dc59b4c8b-89cwl\" (UID: \"2a3a6d07-7d02-4d47-836e-75a930433d87\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.273249 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.288379 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4zbr\" (UniqueName: \"kubernetes.io/projected/72557843-c8c0-4228-89fc-e735935e10a3-kube-api-access-b4zbr\") pod \"openshift-config-operator-7777fb866f-xrjhk\" (UID: \"72557843-c8c0-4228-89fc-e735935e10a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.310271 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.312671 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9vc8\" (UniqueName: \"kubernetes.io/projected/83cafff0-05df-46bf-a875-f03ad748a9fc-kube-api-access-f9vc8\") pod \"openshift-apiserver-operator-796bbdcf4f-27jm2\" (UID: \"83cafff0-05df-46bf-a875-f03ad748a9fc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-27jm2" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.330948 4775 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.353593 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.371538 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.390493 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.399651 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.411275 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd"] Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.424675 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqpxb\" (UniqueName: \"kubernetes.io/projected/42c1b38d-39c7-4bfd-ae8e-c8ef6651565f-kube-api-access-xqpxb\") pod \"authentication-operator-69f744f599-ncw52\" (UID: \"42c1b38d-39c7-4bfd-ae8e-c8ef6651565f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.425573 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.454319 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.458825 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r66w7\" (UniqueName: \"kubernetes.io/projected/42c55268-c122-4eb0-9508-b5507897990b-kube-api-access-r66w7\") pod \"openshift-controller-manager-operator-756b6f6bc6-s88th\" (UID: \"42c55268-c122-4eb0-9508-b5507897990b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s88th" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.474183 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.490515 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.504993 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-twhnr"] Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.510915 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.516566 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2df1b126-33ca-4ad8-911b-44c4f1457b99-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-c58rz\" (UID: \"2df1b126-33ca-4ad8-911b-44c4f1457b99\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz" Dec 16 14:57:12 crc kubenswrapper[4775]: 
I1216 14:57:12.516726 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e62a9230-6690-4db9-a808-60b544283182-cert\") pod \"ingress-canary-7v5ck\" (UID: \"e62a9230-6690-4db9-a808-60b544283182\") " pod="openshift-ingress-canary/ingress-canary-7v5ck" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.516748 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d-certs\") pod \"machine-config-server-2nrns\" (UID: \"fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d\") " pod="openshift-machine-config-operator/machine-config-server-2nrns" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.516773 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d-node-bootstrap-token\") pod \"machine-config-server-2nrns\" (UID: \"fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d\") " pod="openshift-machine-config-operator/machine-config-server-2nrns" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.516800 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df1b126-33ca-4ad8-911b-44c4f1457b99-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-c58rz\" (UID: \"2df1b126-33ca-4ad8-911b-44c4f1457b99\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.517196 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df95f79a-fd5f-4a8d-a062-03ce319dc30b-srv-cert\") pod \"catalog-operator-68c6474976-lj8mv\" (UID: \"df95f79a-fd5f-4a8d-a062-03ce319dc30b\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.517973 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df1b126-33ca-4ad8-911b-44c4f1457b99-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-c58rz\" (UID: \"2df1b126-33ca-4ad8-911b-44c4f1457b99\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.518215 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-27jm2" Dec 16 14:57:12 crc kubenswrapper[4775]: W1216 14:57:12.525730 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68f674b8_b7c3_43e8_b132_7d6b881cbd31.slice/crio-018ce0de313ae57dd0abfbbe0adc33e46bf4b6593bc70f3aeb449e507c9d0080 WatchSource:0}: Error finding container 018ce0de313ae57dd0abfbbe0adc33e46bf4b6593bc70f3aeb449e507c9d0080: Status 404 returned error can't find the container with id 018ce0de313ae57dd0abfbbe0adc33e46bf4b6593bc70f3aeb449e507c9d0080 Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.526073 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df95f79a-fd5f-4a8d-a062-03ce319dc30b-srv-cert\") pod \"catalog-operator-68c6474976-lj8mv\" (UID: \"df95f79a-fd5f-4a8d-a062-03ce319dc30b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.526544 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2df1b126-33ca-4ad8-911b-44c4f1457b99-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-c58rz\" 
(UID: \"2df1b126-33ca-4ad8-911b-44c4f1457b99\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.526824 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.533816 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.544463 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl"] Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.558013 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s88th" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.574351 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bgqpz"] Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.590987 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmg2m\" (UniqueName: \"kubernetes.io/projected/e1a2834e-159c-47f0-81a8-87d37d89a22a-kube-api-access-gmg2m\") pod \"machine-api-operator-5694c8668f-5fjmz\" (UID: \"e1a2834e-159c-47f0-81a8-87d37d89a22a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.592698 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.605006 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d-certs\") pod \"machine-config-server-2nrns\" (UID: \"fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d\") " pod="openshift-machine-config-operator/machine-config-server-2nrns" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.612238 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.620763 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d-node-bootstrap-token\") pod \"machine-config-server-2nrns\" (UID: \"fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d\") " pod="openshift-machine-config-operator/machine-config-server-2nrns" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.632785 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.641757 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nxkgw"] Dec 16 14:57:12 crc kubenswrapper[4775]: W1216 14:57:12.646392 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54b9e242_469b_450e_b2a1_741d9ee601a1.slice/crio-11b08926f2af0aca4b201605b8a24057821886fc651967a35b93a3b10028b379 WatchSource:0}: Error finding container 11b08926f2af0aca4b201605b8a24057821886fc651967a35b93a3b10028b379: Status 404 returned error can't find the container with id 11b08926f2af0aca4b201605b8a24057821886fc651967a35b93a3b10028b379 Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.666833 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnf49\" (UniqueName: 
\"kubernetes.io/projected/be0d6859-aa4c-4a58-97ea-3f3657d4773f-kube-api-access-jnf49\") pod \"apiserver-7bbb656c7d-hr98m\" (UID: \"be0d6859-aa4c-4a58-97ea-3f3657d4773f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.673341 4775 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.690331 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.710113 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.711050 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.730581 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.736426 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.741593 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e62a9230-6690-4db9-a808-60b544283182-cert\") pod \"ingress-canary-7v5ck\" (UID: \"e62a9230-6690-4db9-a808-60b544283182\") " pod="openshift-ingress-canary/ingress-canary-7v5ck" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.750205 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.768246 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ncw52"] Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.770318 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.790129 4775 request.go:700] Waited for 1.971686305s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.791348 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.810171 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.829452 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.836789 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk"] Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.844762 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s88th"] Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.850998 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 16 14:57:12 crc kubenswrapper[4775]: W1216 14:57:12.890131 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42c55268_c122_4eb0_9508_b5507897990b.slice/crio-06b5b3f4673333344ba17b1aaa2525defb0617b71da8800c1dd18575e231aa81 WatchSource:0}: Error finding container 06b5b3f4673333344ba17b1aaa2525defb0617b71da8800c1dd18575e231aa81: Status 404 returned error can't find the container with id 06b5b3f4673333344ba17b1aaa2525defb0617b71da8800c1dd18575e231aa81 Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.918138 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwl9g\" (UniqueName: \"kubernetes.io/projected/2b5f39f2-f4e2-4306-b64c-669ca82f8869-kube-api-access-lwl9g\") pod \"oauth-openshift-558db77b4-ms9lk\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.931202 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-27jm2"] Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.933881 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxr94\" (UniqueName: \"kubernetes.io/projected/edb75c2a-9e6f-4a80-aadd-38416ba9c9a4-kube-api-access-dxr94\") pod \"downloads-7954f5f757-5bmtw\" (UID: \"edb75c2a-9e6f-4a80-aadd-38416ba9c9a4\") " 
pod="openshift-console/downloads-7954f5f757-5bmtw" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.952333 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsgxd\" (UniqueName: \"kubernetes.io/projected/c21af7b0-6f27-43de-8c44-6e6519262019-kube-api-access-tsgxd\") pod \"console-f9d7485db-fc2jr\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.956481 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-5bmtw" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.961756 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.967015 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwtqh\" (UniqueName: \"kubernetes.io/projected/4e5a6b47-360c-4b64-9ba3-15edeb2006fa-kube-api-access-hwtqh\") pod \"cluster-samples-operator-665b6dd947-v798k\" (UID: \"4e5a6b47-360c-4b64-9ba3-15edeb2006fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v798k" Dec 16 14:57:12 crc kubenswrapper[4775]: I1216 14:57:12.990240 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcfwq\" (UniqueName: \"kubernetes.io/projected/56428379-949d-4ba8-9b32-8ee7432abba7-kube-api-access-wcfwq\") pod \"machine-approver-56656f9798-xklhw\" (UID: \"56428379-949d-4ba8-9b32-8ee7432abba7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.019712 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qb6v\" (UniqueName: \"kubernetes.io/projected/e62a9230-6690-4db9-a808-60b544283182-kube-api-access-9qb6v\") pod 
\"ingress-canary-7v5ck\" (UID: \"e62a9230-6690-4db9-a808-60b544283182\") " pod="openshift-ingress-canary/ingress-canary-7v5ck" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.028686 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxqzh\" (UniqueName: \"kubernetes.io/projected/4b6576b0-b9f9-4599-8876-c9b7b0a60a43-kube-api-access-mxqzh\") pod \"package-server-manager-789f6589d5-b5vwq\" (UID: \"4b6576b0-b9f9-4599-8876-c9b7b0a60a43\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b5vwq" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.044235 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35f10989-95d7-4557-9ba3-b2e966a63938-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rpxqq\" (UID: \"35f10989-95d7-4557-9ba3-b2e966a63938\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rpxqq" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.061594 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b5vwq" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.068844 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdqh5\" (UniqueName: \"kubernetes.io/projected/24ba18d0-a989-4a98-99d3-bbdb42ce4bb9-kube-api-access-vdqh5\") pod \"service-ca-operator-777779d784-9gmd5\" (UID: \"24ba18d0-a989-4a98-99d3-bbdb42ce4bb9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9gmd5" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.090510 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwcwm\" (UniqueName: \"kubernetes.io/projected/2df1b126-33ca-4ad8-911b-44c4f1457b99-kube-api-access-rwcwm\") pod \"kube-storage-version-migrator-operator-b67b599dd-c58rz\" (UID: \"2df1b126-33ca-4ad8-911b-44c4f1457b99\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.112850 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j7r9\" (UniqueName: \"kubernetes.io/projected/0eaef920-ee71-48f6-b577-02528a4ec363-kube-api-access-5j7r9\") pod \"migrator-59844c95c7-ql6wt\" (UID: \"0eaef920-ee71-48f6-b577-02528a4ec363\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ql6wt" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.121377 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9gmd5" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.137463 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlsfq\" (UniqueName: \"kubernetes.io/projected/df95f79a-fd5f-4a8d-a062-03ce319dc30b-kube-api-access-qlsfq\") pod \"catalog-operator-68c6474976-lj8mv\" (UID: \"df95f79a-fd5f-4a8d-a062-03ce319dc30b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.145699 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" event={"ID":"68f674b8-b7c3-43e8-b132-7d6b881cbd31","Type":"ContainerStarted","Data":"5d55f4b497106e7e0dc04a363d9e5a58d6a1d1626b5e6866bc391e04e0d5b7c8"} Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.145757 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" event={"ID":"68f674b8-b7c3-43e8-b132-7d6b881cbd31","Type":"ContainerStarted","Data":"018ce0de313ae57dd0abfbbe0adc33e46bf4b6593bc70f3aeb449e507c9d0080"} Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.145781 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss5ht\" (UniqueName: \"kubernetes.io/projected/b36ff831-d91c-4350-a36b-bd0625ffb661-kube-api-access-ss5ht\") pod \"marketplace-operator-79b997595-58rfh\" (UID: \"b36ff831-d91c-4350-a36b-bd0625ffb661\") " pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.146549 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.149659 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.169193 4775 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-twhnr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.169239 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" podUID="68f674b8-b7c3-43e8-b132-7d6b881cbd31" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.170287 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.172398 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e45773ce-d026-4240-bde4-17339b57ef93-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-54js6\" (UID: \"e45773ce-d026-4240-bde4-17339b57ef93\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-54js6" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.174562 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl" event={"ID":"2a3a6d07-7d02-4d47-836e-75a930433d87","Type":"ContainerStarted","Data":"02e1c7b171cb078fa4959d91c4aa768817f5d54abe9d1eaadad1e0d10db2319e"} Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.174610 4775 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl" event={"ID":"2a3a6d07-7d02-4d47-836e-75a930433d87","Type":"ContainerStarted","Data":"64598d2d2e440c88e30d0c5b28b0899f0ead406e877bd144aff659d98fa565ba"} Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.186816 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s88th" event={"ID":"42c55268-c122-4eb0-9508-b5507897990b","Type":"ContainerStarted","Data":"06b5b3f4673333344ba17b1aaa2525defb0617b71da8800c1dd18575e231aa81"} Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.190208 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" event={"ID":"59b049c3-67e7-4fef-8a8e-b90fb5f75bba","Type":"ContainerStarted","Data":"8ef56132fd6deb57e73e803310ee97a51302419ba6948ce3636f808409968764"} Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.190267 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" event={"ID":"59b049c3-67e7-4fef-8a8e-b90fb5f75bba","Type":"ContainerStarted","Data":"00702ece8e40a76fbc272a32c36598fb69e668d6d3f53164dc399a2855953951"} Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.191409 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.197409 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bgqpz" event={"ID":"54b9e242-469b-450e-b2a1-741d9ee601a1","Type":"ContainerStarted","Data":"95a08dd1e1b6ba4408a069fe3cb7ae5bf125ac2b9897758ec968d6354f180ccf"} Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.197453 4775 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-console-operator/console-operator-58897d9998-bgqpz" event={"ID":"54b9e242-469b-450e-b2a1-741d9ee601a1","Type":"ContainerStarted","Data":"11b08926f2af0aca4b201605b8a24057821886fc651967a35b93a3b10028b379"} Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.197627 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-bgqpz" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.201216 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7v5ck" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.202298 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65bf6\" (UniqueName: \"kubernetes.io/projected/a492f5f7-b613-4e56-8071-78f8c836e7c3-kube-api-access-65bf6\") pod \"control-plane-machine-set-operator-78cbb6b69f-2tkrk\" (UID: \"a492f5f7-b613-4e56-8071-78f8c836e7c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2tkrk" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.209814 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbvfx\" (UniqueName: \"kubernetes.io/projected/49286131-fb39-4261-bc4f-db68474c8fa0-kube-api-access-wbvfx\") pod \"olm-operator-6b444d44fb-74649\" (UID: \"49286131-fb39-4261-bc4f-db68474c8fa0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.212584 4775 generic.go:334] "Generic (PLEG): container finished" podID="d55038e1-9978-48f8-b430-78c7da1ca5e5" containerID="316aac07d14db2680e481ac0be6657dca41cd75618e82a38041f6212567014f4" exitCode=0 Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.212641 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" 
event={"ID":"d55038e1-9978-48f8-b430-78c7da1ca5e5","Type":"ContainerDied","Data":"316aac07d14db2680e481ac0be6657dca41cd75618e82a38041f6212567014f4"} Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.212684 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" event={"ID":"d55038e1-9978-48f8-b430-78c7da1ca5e5","Type":"ContainerStarted","Data":"1a792f59098a1d846085edd3573d7015c6b4c2ad4e74d666dac8315fcaf50619"} Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.217821 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" event={"ID":"42c1b38d-39c7-4bfd-ae8e-c8ef6651565f","Type":"ContainerStarted","Data":"ec0cc32bcf606fa46b6299e56ad7ebe64549c5ebc3db350359dce9a6275eebe4"} Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.217864 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" event={"ID":"42c1b38d-39c7-4bfd-ae8e-c8ef6651565f","Type":"ContainerStarted","Data":"a58cbfac95f0f957c4be1f4e8b38d61e5853719a995c611a026534c20fb4e1b8"} Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.221054 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v798k" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.227846 4775 patch_prober.go:28] interesting pod/console-operator-58897d9998-bgqpz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.227913 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bgqpz" podUID="54b9e242-469b-450e-b2a1-741d9ee601a1" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.233919 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-27jm2" event={"ID":"83cafff0-05df-46bf-a875-f03ad748a9fc","Type":"ContainerStarted","Data":"f7cb17e0a609b055e86684a68e25f9cb7d2d5344a3dd981fa9de73873f18eba1"} Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.240558 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.247481 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.248277 4775 generic.go:334] "Generic (PLEG): container finished" podID="72557843-c8c0-4228-89fc-e735935e10a3" containerID="197cfa344bdca4e3588c34cbd515e84cb62b428e0a68e80307da68634735be17" exitCode=0 Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.249043 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5fjmz"] Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.249072 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk" event={"ID":"72557843-c8c0-4228-89fc-e735935e10a3","Type":"ContainerDied","Data":"197cfa344bdca4e3588c34cbd515e84cb62b428e0a68e80307da68634735be17"} Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.249093 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk" event={"ID":"72557843-c8c0-4228-89fc-e735935e10a3","Type":"ContainerStarted","Data":"cb7b69af770ce550a9243c54a36affb8794a5431aa3105bd9193c7c58bf6e493"} Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.250590 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j64p5\" (UniqueName: \"kubernetes.io/projected/fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d-kube-api-access-j64p5\") pod \"machine-config-server-2nrns\" (UID: \"fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d\") " pod="openshift-machine-config-operator/machine-config-server-2nrns" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.253003 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp98r\" (UniqueName: \"kubernetes.io/projected/40f1d8c0-d195-457c-909e-10fd294a0bfc-kube-api-access-tp98r\") pod \"multus-admission-controller-857f4d67dd-d9jzf\" (UID: 
\"40f1d8c0-d195-457c-909e-10fd294a0bfc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-d9jzf" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.268679 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5bmtw"] Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.270551 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m"] Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.271126 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgqkw\" (UniqueName: \"kubernetes.io/projected/b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a-kube-api-access-kgqkw\") pod \"service-ca-9c57cc56f-rfplx\" (UID: \"b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a\") " pod="openshift-service-ca/service-ca-9c57cc56f-rfplx" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.277226 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rpxqq" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.288290 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdrcf\" (UniqueName: \"kubernetes.io/projected/9ed9516f-e373-480b-a645-ad35bec98fa4-kube-api-access-zdrcf\") pod \"machine-config-controller-84d6567774-pw4wv\" (UID: \"9ed9516f-e373-480b-a645-ad35bec98fa4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.299642 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ms9lk"] Dec 16 14:57:13 crc kubenswrapper[4775]: W1216 14:57:13.300809 4775 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe0d6859_aa4c_4a58_97ea_3f3657d4773f.slice/crio-31dd977b8ac18929e81965d718cf97679e0a834b9a2c6d2fecd6c4e75820815c WatchSource:0}: Error finding container 31dd977b8ac18929e81965d718cf97679e0a834b9a2c6d2fecd6c4e75820815c: Status 404 returned error can't find the container with id 31dd977b8ac18929e81965d718cf97679e0a834b9a2c6d2fecd6c4e75820815c Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.313433 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2tkrk" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.313460 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/11dbd98a-eb9e-4d5f-b52d-df105cbeb83c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zmgzc\" (UID: \"11dbd98a-eb9e-4d5f-b52d-df105cbeb83c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zmgzc" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.332265 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zmgzc" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.332245 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rfplx" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.342347 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ql6wt" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354056 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354349 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d72e9f6-db87-4192-8f08-32788a4ad601-service-ca-bundle\") pod \"router-default-5444994796-dw4sb\" (UID: \"5d72e9f6-db87-4192-8f08-32788a4ad601\") " pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354386 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5f57ccda-46a6-46ca-883a-0ae41ed65a07-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wlgbd\" (UID: \"5f57ccda-46a6-46ca-883a-0ae41ed65a07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354423 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56f482f0-f875-48f1-9a10-f06deeb5791e-config-volume\") pod \"dns-default-6kp7p\" (UID: \"56f482f0-f875-48f1-9a10-f06deeb5791e\") " pod="openshift-dns/dns-default-6kp7p" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354462 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a35488ea-bcad-480f-9e21-756619b1ed3b-etcd-service-ca\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354484 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ks4d\" 
(UniqueName: \"kubernetes.io/projected/56f482f0-f875-48f1-9a10-f06deeb5791e-kube-api-access-2ks4d\") pod \"dns-default-6kp7p\" (UID: \"56f482f0-f875-48f1-9a10-f06deeb5791e\") " pod="openshift-dns/dns-default-6kp7p" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354525 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354602 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5f57ccda-46a6-46ca-883a-0ae41ed65a07-images\") pod \"machine-config-operator-74547568cd-wlgbd\" (UID: \"5f57ccda-46a6-46ca-883a-0ae41ed65a07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354632 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-268fs\" (UniqueName: \"kubernetes.io/projected/a35488ea-bcad-480f-9e21-756619b1ed3b-kube-api-access-268fs\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354654 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/62f09707-b36c-4651-88df-e9bc6dd527a4-trusted-ca\") pod \"ingress-operator-5b745b69d9-86sn4\" (UID: \"62f09707-b36c-4651-88df-e9bc6dd527a4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" Dec 16 14:57:13 crc 
kubenswrapper[4775]: I1216 14:57:13.354694 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a35488ea-bcad-480f-9e21-756619b1ed3b-serving-cert\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354714 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/62f09707-b36c-4651-88df-e9bc6dd527a4-metrics-tls\") pod \"ingress-operator-5b745b69d9-86sn4\" (UID: \"62f09707-b36c-4651-88df-e9bc6dd527a4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354735 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1734b68-7b3d-49f0-9398-879da24fa19d-secret-volume\") pod \"collect-profiles-29431605-6v455\" (UID: \"a1734b68-7b3d-49f0-9398-879da24fa19d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354754 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/26e83196-9fbb-41ab-a359-d404437ee1e9-mountpoint-dir\") pod \"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354804 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-installation-pull-secrets\") pod 
\"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354829 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f57ccda-46a6-46ca-883a-0ae41ed65a07-proxy-tls\") pod \"machine-config-operator-74547568cd-wlgbd\" (UID: \"5f57ccda-46a6-46ca-883a-0ae41ed65a07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354869 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q68lc\" (UniqueName: \"kubernetes.io/projected/a1734b68-7b3d-49f0-9398-879da24fa19d-kube-api-access-q68lc\") pod \"collect-profiles-29431605-6v455\" (UID: \"a1734b68-7b3d-49f0-9398-879da24fa19d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354926 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a35488ea-bcad-480f-9e21-756619b1ed3b-etcd-client\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354952 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/26e83196-9fbb-41ab-a359-d404437ee1e9-plugins-dir\") pod \"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.354986 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d72e9f6-db87-4192-8f08-32788a4ad601-metrics-certs\") pod \"router-default-5444994796-dw4sb\" (UID: \"5d72e9f6-db87-4192-8f08-32788a4ad601\") " pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.355008 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fswdd\" (UniqueName: \"kubernetes.io/projected/5d72e9f6-db87-4192-8f08-32788a4ad601-kube-api-access-fswdd\") pod \"router-default-5444994796-dw4sb\" (UID: \"5d72e9f6-db87-4192-8f08-32788a4ad601\") " pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.355042 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98nzv\" (UniqueName: \"kubernetes.io/projected/62f09707-b36c-4651-88df-e9bc6dd527a4-kube-api-access-98nzv\") pod \"ingress-operator-5b745b69d9-86sn4\" (UID: \"62f09707-b36c-4651-88df-e9bc6dd527a4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.355069 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-registry-certificates\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.355091 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a35488ea-bcad-480f-9e21-756619b1ed3b-config\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.355135 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a35488ea-bcad-480f-9e21-756619b1ed3b-etcd-ca\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.355156 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5d72e9f6-db87-4192-8f08-32788a4ad601-default-certificate\") pod \"router-default-5444994796-dw4sb\" (UID: \"5d72e9f6-db87-4192-8f08-32788a4ad601\") " pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.355184 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f057db8e-b509-48ad-967f-2ae735085e29-tmpfs\") pod \"packageserver-d55dfcdfc-jxc57\" (UID: \"f057db8e-b509-48ad-967f-2ae735085e29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.355225 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f057db8e-b509-48ad-967f-2ae735085e29-apiservice-cert\") pod \"packageserver-d55dfcdfc-jxc57\" (UID: \"f057db8e-b509-48ad-967f-2ae735085e29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.355269 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/f057db8e-b509-48ad-967f-2ae735085e29-webhook-cert\") pod \"packageserver-d55dfcdfc-jxc57\" (UID: \"f057db8e-b509-48ad-967f-2ae735085e29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.355295 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m86bh\" (UniqueName: \"kubernetes.io/projected/5f57ccda-46a6-46ca-883a-0ae41ed65a07-kube-api-access-m86bh\") pod \"machine-config-operator-74547568cd-wlgbd\" (UID: \"5f57ccda-46a6-46ca-883a-0ae41ed65a07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.355318 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-trusted-ca\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.355366 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/26e83196-9fbb-41ab-a359-d404437ee1e9-socket-dir\") pod \"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.355391 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc 
kubenswrapper[4775]: I1216 14:57:13.355984 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j94w\" (UniqueName: \"kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-kube-api-access-5j94w\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.356009 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62f09707-b36c-4651-88df-e9bc6dd527a4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-86sn4\" (UID: \"62f09707-b36c-4651-88df-e9bc6dd527a4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.356048 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/56f482f0-f875-48f1-9a10-f06deeb5791e-metrics-tls\") pod \"dns-default-6kp7p\" (UID: \"56f482f0-f875-48f1-9a10-f06deeb5791e\") " pod="openshift-dns/dns-default-6kp7p" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.356074 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1734b68-7b3d-49f0-9398-879da24fa19d-config-volume\") pod \"collect-profiles-29431605-6v455\" (UID: \"a1734b68-7b3d-49f0-9398-879da24fa19d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.356117 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmgbn\" (UniqueName: \"kubernetes.io/projected/26e83196-9fbb-41ab-a359-d404437ee1e9-kube-api-access-xmgbn\") pod 
\"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.356179 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/88c27c93-f497-45d7-85f4-8e0804225421-metrics-tls\") pod \"dns-operator-744455d44c-t668z\" (UID: \"88c27c93-f497-45d7-85f4-8e0804225421\") " pod="openshift-dns-operator/dns-operator-744455d44c-t668z" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.356206 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5d72e9f6-db87-4192-8f08-32788a4ad601-stats-auth\") pod \"router-default-5444994796-dw4sb\" (UID: \"5d72e9f6-db87-4192-8f08-32788a4ad601\") " pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.356229 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/26e83196-9fbb-41ab-a359-d404437ee1e9-csi-data-dir\") pod \"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.356259 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-registry-tls\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.356293 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-bound-sa-token\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.356317 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7mpv\" (UniqueName: \"kubernetes.io/projected/f057db8e-b509-48ad-967f-2ae735085e29-kube-api-access-c7mpv\") pod \"packageserver-d55dfcdfc-jxc57\" (UID: \"f057db8e-b509-48ad-967f-2ae735085e29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.356389 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/26e83196-9fbb-41ab-a359-d404437ee1e9-registration-dir\") pod \"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.356416 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldc2j\" (UniqueName: \"kubernetes.io/projected/88c27c93-f497-45d7-85f4-8e0804225421-kube-api-access-ldc2j\") pod \"dns-operator-744455d44c-t668z\" (UID: \"88c27c93-f497-45d7-85f4-8e0804225421\") " pod="openshift-dns-operator/dns-operator-744455d44c-t668z" Dec 16 14:57:13 crc kubenswrapper[4775]: E1216 14:57:13.360486 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:13.860452721 +0000 UTC m=+158.811531644 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.366017 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-d9jzf" Dec 16 14:57:13 crc kubenswrapper[4775]: W1216 14:57:13.366951 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b5f39f2_f4e2_4306_b64c_669ca82f8869.slice/crio-4d9a1b9ba38208559a24b67393be91b93f8c7ca0717a3308d006ad9eb7e44b84 WatchSource:0}: Error finding container 4d9a1b9ba38208559a24b67393be91b93f8c7ca0717a3308d006ad9eb7e44b84: Status 404 returned error can't find the container with id 4d9a1b9ba38208559a24b67393be91b93f8c7ca0717a3308d006ad9eb7e44b84 Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.376844 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.391041 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" Dec 16 14:57:13 crc kubenswrapper[4775]: W1216 14:57:13.392311 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56428379_949d_4ba8_9b32_8ee7432abba7.slice/crio-a42ef9611564d7a9fb5487d49e9fbb9e5965a197439e5e34408fc746a05cfdbc WatchSource:0}: Error finding container a42ef9611564d7a9fb5487d49e9fbb9e5965a197439e5e34408fc746a05cfdbc: Status 404 returned error can't find the container with id a42ef9611564d7a9fb5487d49e9fbb9e5965a197439e5e34408fc746a05cfdbc Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.411391 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-54js6" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.435088 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.457710 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.457847 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/26e83196-9fbb-41ab-a359-d404437ee1e9-socket-dir\") pod \"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.457869 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.457952 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j94w\" (UniqueName: \"kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-kube-api-access-5j94w\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.457968 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62f09707-b36c-4651-88df-e9bc6dd527a4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-86sn4\" (UID: \"62f09707-b36c-4651-88df-e9bc6dd527a4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.457985 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/56f482f0-f875-48f1-9a10-f06deeb5791e-metrics-tls\") pod \"dns-default-6kp7p\" (UID: \"56f482f0-f875-48f1-9a10-f06deeb5791e\") " pod="openshift-dns/dns-default-6kp7p" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458017 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1734b68-7b3d-49f0-9398-879da24fa19d-config-volume\") pod \"collect-profiles-29431605-6v455\" (UID: \"a1734b68-7b3d-49f0-9398-879da24fa19d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458052 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xmgbn\" (UniqueName: \"kubernetes.io/projected/26e83196-9fbb-41ab-a359-d404437ee1e9-kube-api-access-xmgbn\") pod \"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458133 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5d72e9f6-db87-4192-8f08-32788a4ad601-stats-auth\") pod \"router-default-5444994796-dw4sb\" (UID: \"5d72e9f6-db87-4192-8f08-32788a4ad601\") " pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458149 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/26e83196-9fbb-41ab-a359-d404437ee1e9-csi-data-dir\") pod \"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458167 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/88c27c93-f497-45d7-85f4-8e0804225421-metrics-tls\") pod \"dns-operator-744455d44c-t668z\" (UID: \"88c27c93-f497-45d7-85f4-8e0804225421\") " pod="openshift-dns-operator/dns-operator-744455d44c-t668z" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458216 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-registry-tls\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458248 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-bound-sa-token\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458263 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7mpv\" (UniqueName: \"kubernetes.io/projected/f057db8e-b509-48ad-967f-2ae735085e29-kube-api-access-c7mpv\") pod \"packageserver-d55dfcdfc-jxc57\" (UID: \"f057db8e-b509-48ad-967f-2ae735085e29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458413 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/26e83196-9fbb-41ab-a359-d404437ee1e9-registration-dir\") pod \"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458431 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldc2j\" (UniqueName: \"kubernetes.io/projected/88c27c93-f497-45d7-85f4-8e0804225421-kube-api-access-ldc2j\") pod \"dns-operator-744455d44c-t668z\" (UID: \"88c27c93-f497-45d7-85f4-8e0804225421\") " pod="openshift-dns-operator/dns-operator-744455d44c-t668z" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458446 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d72e9f6-db87-4192-8f08-32788a4ad601-service-ca-bundle\") pod \"router-default-5444994796-dw4sb\" (UID: \"5d72e9f6-db87-4192-8f08-32788a4ad601\") " pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:13 crc 
kubenswrapper[4775]: I1216 14:57:13.458462 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5f57ccda-46a6-46ca-883a-0ae41ed65a07-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wlgbd\" (UID: \"5f57ccda-46a6-46ca-883a-0ae41ed65a07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458499 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56f482f0-f875-48f1-9a10-f06deeb5791e-config-volume\") pod \"dns-default-6kp7p\" (UID: \"56f482f0-f875-48f1-9a10-f06deeb5791e\") " pod="openshift-dns/dns-default-6kp7p" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458531 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a35488ea-bcad-480f-9e21-756619b1ed3b-etcd-service-ca\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458553 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ks4d\" (UniqueName: \"kubernetes.io/projected/56f482f0-f875-48f1-9a10-f06deeb5791e-kube-api-access-2ks4d\") pod \"dns-default-6kp7p\" (UID: \"56f482f0-f875-48f1-9a10-f06deeb5791e\") " pod="openshift-dns/dns-default-6kp7p" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458625 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-268fs\" (UniqueName: \"kubernetes.io/projected/a35488ea-bcad-480f-9e21-756619b1ed3b-kube-api-access-268fs\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458641 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5f57ccda-46a6-46ca-883a-0ae41ed65a07-images\") pod \"machine-config-operator-74547568cd-wlgbd\" (UID: \"5f57ccda-46a6-46ca-883a-0ae41ed65a07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458657 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/62f09707-b36c-4651-88df-e9bc6dd527a4-trusted-ca\") pod \"ingress-operator-5b745b69d9-86sn4\" (UID: \"62f09707-b36c-4651-88df-e9bc6dd527a4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458707 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a35488ea-bcad-480f-9e21-756619b1ed3b-serving-cert\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458722 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/62f09707-b36c-4651-88df-e9bc6dd527a4-metrics-tls\") pod \"ingress-operator-5b745b69d9-86sn4\" (UID: \"62f09707-b36c-4651-88df-e9bc6dd527a4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458745 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1734b68-7b3d-49f0-9398-879da24fa19d-secret-volume\") pod \"collect-profiles-29431605-6v455\" (UID: 
\"a1734b68-7b3d-49f0-9398-879da24fa19d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458780 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/26e83196-9fbb-41ab-a359-d404437ee1e9-mountpoint-dir\") pod \"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458798 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458820 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f57ccda-46a6-46ca-883a-0ae41ed65a07-proxy-tls\") pod \"machine-config-operator-74547568cd-wlgbd\" (UID: \"5f57ccda-46a6-46ca-883a-0ae41ed65a07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458855 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q68lc\" (UniqueName: \"kubernetes.io/projected/a1734b68-7b3d-49f0-9398-879da24fa19d-kube-api-access-q68lc\") pod \"collect-profiles-29431605-6v455\" (UID: \"a1734b68-7b3d-49f0-9398-879da24fa19d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458872 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/a35488ea-bcad-480f-9e21-756619b1ed3b-etcd-client\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.458910 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/26e83196-9fbb-41ab-a359-d404437ee1e9-plugins-dir\") pod \"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: E1216 14:57:13.458979 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:13.95895316 +0000 UTC m=+158.910032083 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.459043 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d72e9f6-db87-4192-8f08-32788a4ad601-metrics-certs\") pod \"router-default-5444994796-dw4sb\" (UID: \"5d72e9f6-db87-4192-8f08-32788a4ad601\") " pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.459105 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fswdd\" (UniqueName: \"kubernetes.io/projected/5d72e9f6-db87-4192-8f08-32788a4ad601-kube-api-access-fswdd\") pod \"router-default-5444994796-dw4sb\" (UID: \"5d72e9f6-db87-4192-8f08-32788a4ad601\") " pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.459188 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98nzv\" (UniqueName: \"kubernetes.io/projected/62f09707-b36c-4651-88df-e9bc6dd527a4-kube-api-access-98nzv\") pod \"ingress-operator-5b745b69d9-86sn4\" (UID: \"62f09707-b36c-4651-88df-e9bc6dd527a4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.459277 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-registry-certificates\") pod 
\"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.459312 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a35488ea-bcad-480f-9e21-756619b1ed3b-config\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.459457 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a35488ea-bcad-480f-9e21-756619b1ed3b-etcd-ca\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.459478 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5d72e9f6-db87-4192-8f08-32788a4ad601-default-certificate\") pod \"router-default-5444994796-dw4sb\" (UID: \"5d72e9f6-db87-4192-8f08-32788a4ad601\") " pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.459568 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f057db8e-b509-48ad-967f-2ae735085e29-tmpfs\") pod \"packageserver-d55dfcdfc-jxc57\" (UID: \"f057db8e-b509-48ad-967f-2ae735085e29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.459588 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f057db8e-b509-48ad-967f-2ae735085e29-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-jxc57\" (UID: \"f057db8e-b509-48ad-967f-2ae735085e29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.459633 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f057db8e-b509-48ad-967f-2ae735085e29-webhook-cert\") pod \"packageserver-d55dfcdfc-jxc57\" (UID: \"f057db8e-b509-48ad-967f-2ae735085e29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.459726 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-trusted-ca\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.459744 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m86bh\" (UniqueName: \"kubernetes.io/projected/5f57ccda-46a6-46ca-883a-0ae41ed65a07-kube-api-access-m86bh\") pod \"machine-config-operator-74547568cd-wlgbd\" (UID: \"5f57ccda-46a6-46ca-883a-0ae41ed65a07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.460406 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/26e83196-9fbb-41ab-a359-d404437ee1e9-socket-dir\") pod \"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.460628 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/26e83196-9fbb-41ab-a359-d404437ee1e9-registration-dir\") pod \"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.461347 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.461805 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5f57ccda-46a6-46ca-883a-0ae41ed65a07-images\") pod \"machine-config-operator-74547568cd-wlgbd\" (UID: \"5f57ccda-46a6-46ca-883a-0ae41ed65a07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.462418 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d72e9f6-db87-4192-8f08-32788a4ad601-service-ca-bundle\") pod \"router-default-5444994796-dw4sb\" (UID: \"5d72e9f6-db87-4192-8f08-32788a4ad601\") " pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.463651 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56f482f0-f875-48f1-9a10-f06deeb5791e-config-volume\") pod \"dns-default-6kp7p\" (UID: \"56f482f0-f875-48f1-9a10-f06deeb5791e\") " pod="openshift-dns/dns-default-6kp7p" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.464745 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/5f57ccda-46a6-46ca-883a-0ae41ed65a07-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wlgbd\" (UID: \"5f57ccda-46a6-46ca-883a-0ae41ed65a07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.467475 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/62f09707-b36c-4651-88df-e9bc6dd527a4-trusted-ca\") pod \"ingress-operator-5b745b69d9-86sn4\" (UID: \"62f09707-b36c-4651-88df-e9bc6dd527a4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.468731 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f057db8e-b509-48ad-967f-2ae735085e29-tmpfs\") pod \"packageserver-d55dfcdfc-jxc57\" (UID: \"f057db8e-b509-48ad-967f-2ae735085e29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.469005 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-registry-certificates\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.469704 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a35488ea-bcad-480f-9e21-756619b1ed3b-config\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.470033 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1734b68-7b3d-49f0-9398-879da24fa19d-config-volume\") pod \"collect-profiles-29431605-6v455\" (UID: \"a1734b68-7b3d-49f0-9398-879da24fa19d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.470398 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2nrns" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.471238 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a35488ea-bcad-480f-9e21-756619b1ed3b-etcd-ca\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.471816 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/56f482f0-f875-48f1-9a10-f06deeb5791e-metrics-tls\") pod \"dns-default-6kp7p\" (UID: \"56f482f0-f875-48f1-9a10-f06deeb5791e\") " pod="openshift-dns/dns-default-6kp7p" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.472040 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/26e83196-9fbb-41ab-a359-d404437ee1e9-mountpoint-dir\") pod \"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.475238 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-trusted-ca\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" 
Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.475314 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a35488ea-bcad-480f-9e21-756619b1ed3b-etcd-service-ca\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.475959 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/26e83196-9fbb-41ab-a359-d404437ee1e9-csi-data-dir\") pod \"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.481047 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f057db8e-b509-48ad-967f-2ae735085e29-webhook-cert\") pod \"packageserver-d55dfcdfc-jxc57\" (UID: \"f057db8e-b509-48ad-967f-2ae735085e29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.481389 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/26e83196-9fbb-41ab-a359-d404437ee1e9-plugins-dir\") pod \"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.481490 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d72e9f6-db87-4192-8f08-32788a4ad601-metrics-certs\") pod \"router-default-5444994796-dw4sb\" (UID: \"5d72e9f6-db87-4192-8f08-32788a4ad601\") " pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 
14:57:13.486397 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f57ccda-46a6-46ca-883a-0ae41ed65a07-proxy-tls\") pod \"machine-config-operator-74547568cd-wlgbd\" (UID: \"5f57ccda-46a6-46ca-883a-0ae41ed65a07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.486707 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a35488ea-bcad-480f-9e21-756619b1ed3b-etcd-client\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.486853 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/88c27c93-f497-45d7-85f4-8e0804225421-metrics-tls\") pod \"dns-operator-744455d44c-t668z\" (UID: \"88c27c93-f497-45d7-85f4-8e0804225421\") " pod="openshift-dns-operator/dns-operator-744455d44c-t668z" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.491638 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f057db8e-b509-48ad-967f-2ae735085e29-apiservice-cert\") pod \"packageserver-d55dfcdfc-jxc57\" (UID: \"f057db8e-b509-48ad-967f-2ae735085e29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.492953 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a35488ea-bcad-480f-9e21-756619b1ed3b-serving-cert\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 
14:57:13.495686 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5d72e9f6-db87-4192-8f08-32788a4ad601-default-certificate\") pod \"router-default-5444994796-dw4sb\" (UID: \"5d72e9f6-db87-4192-8f08-32788a4ad601\") " pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.496652 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.498461 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1734b68-7b3d-49f0-9398-879da24fa19d-secret-volume\") pod \"collect-profiles-29431605-6v455\" (UID: \"a1734b68-7b3d-49f0-9398-879da24fa19d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.507618 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-bound-sa-token\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.511355 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-registry-tls\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 
14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.516568 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/62f09707-b36c-4651-88df-e9bc6dd527a4-metrics-tls\") pod \"ingress-operator-5b745b69d9-86sn4\" (UID: \"62f09707-b36c-4651-88df-e9bc6dd527a4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.521313 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5d72e9f6-db87-4192-8f08-32788a4ad601-stats-auth\") pod \"router-default-5444994796-dw4sb\" (UID: \"5d72e9f6-db87-4192-8f08-32788a4ad601\") " pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.522116 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7mpv\" (UniqueName: \"kubernetes.io/projected/f057db8e-b509-48ad-967f-2ae735085e29-kube-api-access-c7mpv\") pod \"packageserver-d55dfcdfc-jxc57\" (UID: \"f057db8e-b509-48ad-967f-2ae735085e29\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.547631 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ks4d\" (UniqueName: \"kubernetes.io/projected/56f482f0-f875-48f1-9a10-f06deeb5791e-kube-api-access-2ks4d\") pod \"dns-default-6kp7p\" (UID: \"56f482f0-f875-48f1-9a10-f06deeb5791e\") " pod="openshift-dns/dns-default-6kp7p" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.565685 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: E1216 14:57:13.574126 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:14.0741064 +0000 UTC m=+159.025185323 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.574353 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldc2j\" (UniqueName: \"kubernetes.io/projected/88c27c93-f497-45d7-85f4-8e0804225421-kube-api-access-ldc2j\") pod \"dns-operator-744455d44c-t668z\" (UID: \"88c27c93-f497-45d7-85f4-8e0804225421\") " pod="openshift-dns-operator/dns-operator-744455d44c-t668z" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.577048 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-268fs\" (UniqueName: \"kubernetes.io/projected/a35488ea-bcad-480f-9e21-756619b1ed3b-kube-api-access-268fs\") pod \"etcd-operator-b45778765-2r426\" (UID: \"a35488ea-bcad-480f-9e21-756619b1ed3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.615350 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j94w\" (UniqueName: \"kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-kube-api-access-5j94w\") pod 
\"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.633060 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62f09707-b36c-4651-88df-e9bc6dd527a4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-86sn4\" (UID: \"62f09707-b36c-4651-88df-e9bc6dd527a4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.659649 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b5vwq"] Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.678779 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:13 crc kubenswrapper[4775]: E1216 14:57:13.679171 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:14.179109592 +0000 UTC m=+159.130188525 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.679792 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7v5ck"] Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.679806 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: E1216 14:57:13.680333 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:14.18031587 +0000 UTC m=+159.131394793 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.701120 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv"] Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.704102 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-t668z" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.707085 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98nzv\" (UniqueName: \"kubernetes.io/projected/62f09707-b36c-4651-88df-e9bc6dd527a4-kube-api-access-98nzv\") pod \"ingress-operator-5b745b69d9-86sn4\" (UID: \"62f09707-b36c-4651-88df-e9bc6dd527a4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.720737 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz"] Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.721145 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fswdd\" (UniqueName: \"kubernetes.io/projected/5d72e9f6-db87-4192-8f08-32788a4ad601-kube-api-access-fswdd\") pod \"router-default-5444994796-dw4sb\" (UID: \"5d72e9f6-db87-4192-8f08-32788a4ad601\") " pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.726918 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m86bh\" (UniqueName: \"kubernetes.io/projected/5f57ccda-46a6-46ca-883a-0ae41ed65a07-kube-api-access-m86bh\") pod \"machine-config-operator-74547568cd-wlgbd\" (UID: \"5f57ccda-46a6-46ca-883a-0ae41ed65a07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.727333 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmgbn\" (UniqueName: \"kubernetes.io/projected/26e83196-9fbb-41ab-a359-d404437ee1e9-kube-api-access-xmgbn\") pod \"csi-hostpathplugin-t98bs\" (UID: \"26e83196-9fbb-41ab-a359-d404437ee1e9\") " pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.739223 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9gmd5"] Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.748467 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.748991 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.763371 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.765413 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q68lc\" (UniqueName: \"kubernetes.io/projected/a1734b68-7b3d-49f0-9398-879da24fa19d-kube-api-access-q68lc\") pod \"collect-profiles-29431605-6v455\" (UID: \"a1734b68-7b3d-49f0-9398-879da24fa19d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.783147 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:13 crc kubenswrapper[4775]: E1216 14:57:13.783278 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:14.283249207 +0000 UTC m=+159.234328130 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.783501 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:13 crc kubenswrapper[4775]: E1216 14:57:13.784117 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:14.284035382 +0000 UTC m=+159.235114305 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.808195 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6kp7p" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.808247 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-t98bs" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.826328 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.892308 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:13 crc kubenswrapper[4775]: E1216 14:57:13.892691 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:14.392670527 +0000 UTC m=+159.343749450 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:13 crc kubenswrapper[4775]: I1216 14:57:13.969349 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:13.993948 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:14 crc kubenswrapper[4775]: E1216 14:57:13.997762 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:14.497743362 +0000 UTC m=+159.448822285 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.031326 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455" Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.099526 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:14 crc kubenswrapper[4775]: E1216 14:57:14.099937 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:14.599918605 +0000 UTC m=+159.550997528 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.200997 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:14 crc kubenswrapper[4775]: E1216 14:57:14.201429 4775 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:14.701416837 +0000 UTC m=+159.652495760 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.297493 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v798k"] Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.303746 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:14 crc kubenswrapper[4775]: E1216 14:57:14.304302 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:14.804273861 +0000 UTC m=+159.755352784 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.322967 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-fc2jr"] Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.329876 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2tkrk"] Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.331044 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rpxqq"] Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.366754 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" event={"ID":"56428379-949d-4ba8-9b32-8ee7432abba7","Type":"ContainerStarted","Data":"a42ef9611564d7a9fb5487d49e9fbb9e5965a197439e5e34408fc746a05cfdbc"} Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.393095 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" event={"ID":"d55038e1-9978-48f8-b430-78c7da1ca5e5","Type":"ContainerStarted","Data":"577b6b50f7753b78e5b0f3723e988ca6f08ccd40d0fff4255ec36ec9adfbf63a"} Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.406189 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:14 crc kubenswrapper[4775]: E1216 14:57:14.406619 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:14.90660128 +0000 UTC m=+159.857680213 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.413685 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" event={"ID":"be0d6859-aa4c-4a58-97ea-3f3657d4773f","Type":"ContainerStarted","Data":"31dd977b8ac18929e81965d718cf97679e0a834b9a2c6d2fecd6c4e75820815c"} Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.437333 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b5vwq" event={"ID":"4b6576b0-b9f9-4599-8876-c9b7b0a60a43","Type":"ContainerStarted","Data":"a432784575de9a73448fc97165f289f67c0f665763d41e9ab96637909dbda757"} Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.457657 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-bgqpz" podStartSLOduration=137.45762853 podStartE2EDuration="2m17.45762853s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:14.455767691 +0000 UTC m=+159.406846624" watchObservedRunningTime="2025-12-16 14:57:14.45762853 +0000 UTC m=+159.408707453" Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.458619 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-27jm2" podStartSLOduration=137.45860917 podStartE2EDuration="2m17.45860917s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:14.418135802 +0000 UTC m=+159.369214725" watchObservedRunningTime="2025-12-16 14:57:14.45860917 +0000 UTC m=+159.409688093" Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.460348 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-27jm2" event={"ID":"83cafff0-05df-46bf-a875-f03ad748a9fc","Type":"ContainerStarted","Data":"b59520c69f2f706cd9db9fcb74f5041272c21f7ef078022c14c273ce5f6a649a"} Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.487125 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk" event={"ID":"72557843-c8c0-4228-89fc-e735935e10a3","Type":"ContainerStarted","Data":"227f7474698d955ab0b5fe991fe65982149b817f1474b803400eb8b02aed1317"} Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.528211 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:14 crc 
kubenswrapper[4775]: I1216 14:57:14.528745 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk" Dec 16 14:57:14 crc kubenswrapper[4775]: E1216 14:57:14.538292 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:15.038256578 +0000 UTC m=+159.989335491 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.544979 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9gmd5" event={"ID":"24ba18d0-a989-4a98-99d3-bbdb42ce4bb9","Type":"ContainerStarted","Data":"4b48cee4d689d9d092703b07d1683b233db97728814386a4eb433713677268f3"} Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.549552 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" event={"ID":"df95f79a-fd5f-4a8d-a062-03ce319dc30b","Type":"ContainerStarted","Data":"f73ca8dc7d11922de5af27a648101b73fa28eb047470c2265d2db21f7b56aae2"} Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.554509 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-d9jzf"] Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.646402 4775 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5bmtw" event={"ID":"edb75c2a-9e6f-4a80-aadd-38416ba9c9a4","Type":"ContainerStarted","Data":"99309726aede4b20c628049db2a33f08ae517b980c45f53da7237c9adf496048"} Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.646668 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5bmtw" event={"ID":"edb75c2a-9e6f-4a80-aadd-38416ba9c9a4","Type":"ContainerStarted","Data":"c1f8b9ab83197f6a1da04c2fbc7bc47386b528d9235e11fc4df291e1789aabd1"} Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.647585 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.649572 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5bmtw" Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.659608 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ql6wt"] Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.659985 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-5bmtw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.660025 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5bmtw" podUID="edb75c2a-9e6f-4a80-aadd-38416ba9c9a4" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" Dec 16 14:57:14 crc kubenswrapper[4775]: W1216 14:57:14.669458 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40f1d8c0_d195_457c_909e_10fd294a0bfc.slice/crio-edac14f0ebbfbb4e4f1ec16806231cd66b39ee57b85b0bd691a4275258e3d578 WatchSource:0}: Error finding container edac14f0ebbfbb4e4f1ec16806231cd66b39ee57b85b0bd691a4275258e3d578: Status 404 returned error can't find the container with id edac14f0ebbfbb4e4f1ec16806231cd66b39ee57b85b0bd691a4275258e3d578 Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.669677 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2nrns" event={"ID":"fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d","Type":"ContainerStarted","Data":"5677f06c202121d2680588649ec3cd38a1563a03aec7a07b493f424eec4f65cb"} Dec 16 14:57:14 crc kubenswrapper[4775]: E1216 14:57:14.676610 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:15.176592534 +0000 UTC m=+160.127671457 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.728558 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7v5ck" event={"ID":"e62a9230-6690-4db9-a808-60b544283182","Type":"ContainerStarted","Data":"463d363310e10aeb6608b9dab5d2211c860d93784adfaeb636c222d0d231ad70"} Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.743785 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s88th" event={"ID":"42c55268-c122-4eb0-9508-b5507897990b","Type":"ContainerStarted","Data":"782cacc29c0bc11647a93414d30a748174602f83dacf22ba522366809e4b3a68"} Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.750410 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:14 crc kubenswrapper[4775]: E1216 14:57:14.751674 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:15.251654798 +0000 UTC m=+160.202733711 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.767810 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zmgzc"] Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.771694 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz" event={"ID":"2df1b126-33ca-4ad8-911b-44c4f1457b99","Type":"ContainerStarted","Data":"fb8e97cf47515041211c59c554bc1a40acf75b9b0bc64926fa3bcead4964f560"} Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.792651 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-54js6"] Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.837689 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" event={"ID":"e1a2834e-159c-47f0-81a8-87d37d89a22a","Type":"ContainerStarted","Data":"cda3794217cf9b73cf811b85bc5998eae8c805f4866bfaa1c0b1e314870c177e"} Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.837735 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" event={"ID":"e1a2834e-159c-47f0-81a8-87d37d89a22a","Type":"ContainerStarted","Data":"79b4c48f81673d23f4810042d2c13cb30cadeba350492522410bfb414f599b2d"} Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.852116 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:14 crc kubenswrapper[4775]: E1216 14:57:14.853935 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:15.353917094 +0000 UTC m=+160.304996017 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.877473 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" event={"ID":"2b5f39f2-f4e2-4306-b64c-669ca82f8869","Type":"ContainerStarted","Data":"4d9a1b9ba38208559a24b67393be91b93f8c7ca0717a3308d006ad9eb7e44b84"} Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.895348 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv"] Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.895767 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-bgqpz" Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 
14:57:14.904201 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.918211 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" podStartSLOduration=137.918185369 podStartE2EDuration="2m17.918185369s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:14.917778676 +0000 UTC m=+159.868857609" watchObservedRunningTime="2025-12-16 14:57:14.918185369 +0000 UTC m=+159.869264302" Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.952978 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:14 crc kubenswrapper[4775]: E1216 14:57:14.954481 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:15.454462336 +0000 UTC m=+160.405541259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.982873 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-58rfh"] Dec 16 14:57:14 crc kubenswrapper[4775]: W1216 14:57:14.991464 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ed9516f_e373_480b_a645_ad35bec98fa4.slice/crio-dd5b6e5c60517866b0794ed76b1653aa2f7627d9cad26b4e6b2e861586e74738 WatchSource:0}: Error finding container dd5b6e5c60517866b0794ed76b1653aa2f7627d9cad26b4e6b2e861586e74738: Status 404 returned error can't find the container with id dd5b6e5c60517866b0794ed76b1653aa2f7627d9cad26b4e6b2e861586e74738 Dec 16 14:57:14 crc kubenswrapper[4775]: I1216 14:57:14.994611 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649"] Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.005887 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw52" podStartSLOduration=138.005871038 podStartE2EDuration="2m18.005871038s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:15.003693169 +0000 UTC m=+159.954772092" watchObservedRunningTime="2025-12-16 14:57:15.005871038 +0000 UTC m=+159.956949961" Dec 16 14:57:15 crc 
kubenswrapper[4775]: I1216 14:57:15.070918 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:15 crc kubenswrapper[4775]: E1216 14:57:15.071347 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:15.57133283 +0000 UTC m=+160.522411753 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.106323 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rfplx"] Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.223061 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:15 crc kubenswrapper[4775]: E1216 14:57:15.242992 4775 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:15.74294166 +0000 UTC m=+160.694020593 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:15 crc kubenswrapper[4775]: W1216 14:57:15.302706 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8afbcbd_ab0e_44cf_98fc_fa5c2c50647a.slice/crio-3756cbc3e6265d7204ab0ba345fe9f49594e86916e7a2a9b3341dc0c1990353b WatchSource:0}: Error finding container 3756cbc3e6265d7204ab0ba345fe9f49594e86916e7a2a9b3341dc0c1990353b: Status 404 returned error can't find the container with id 3756cbc3e6265d7204ab0ba345fe9f49594e86916e7a2a9b3341dc0c1990353b Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.344627 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:15 crc kubenswrapper[4775]: E1216 14:57:15.345822 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-16 14:57:15.845800725 +0000 UTC m=+160.796879648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:15 crc kubenswrapper[4775]: W1216 14:57:15.355330 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49286131_fb39_4261_bc4f_db68474c8fa0.slice/crio-d32155508bfbd20a8a25de447dcfbb42c90e03ed93815b4f3a6501cbfb1d8374 WatchSource:0}: Error finding container d32155508bfbd20a8a25de447dcfbb42c90e03ed93815b4f3a6501cbfb1d8374: Status 404 returned error can't find the container with id d32155508bfbd20a8a25de447dcfbb42c90e03ed93815b4f3a6501cbfb1d8374 Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.447313 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" podStartSLOduration=137.447288137 podStartE2EDuration="2m17.447288137s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:15.36701622 +0000 UTC m=+160.318095143" watchObservedRunningTime="2025-12-16 14:57:15.447288137 +0000 UTC m=+160.398367060" Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.453510 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:15 crc kubenswrapper[4775]: E1216 14:57:15.454784 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:15.954760611 +0000 UTC m=+160.905839544 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.477495 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89cwl" podStartSLOduration=138.477473603 podStartE2EDuration="2m18.477473603s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:15.437105808 +0000 UTC m=+160.388184741" watchObservedRunningTime="2025-12-16 14:57:15.477473603 +0000 UTC m=+160.428552526" Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.535118 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t668z"] Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.555018 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:15 crc kubenswrapper[4775]: E1216 14:57:15.555527 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:16.055511109 +0000 UTC m=+161.006590032 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.564871 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd"] Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.624350 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4"] Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.655508 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:15 crc kubenswrapper[4775]: E1216 14:57:15.656087 4775 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:16.156064782 +0000 UTC m=+161.107143705 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.750170 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2r426"] Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.761415 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:15 crc kubenswrapper[4775]: E1216 14:57:15.761888 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:16.26187165 +0000 UTC m=+161.212950573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.792249 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57"] Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.842060 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6kp7p"] Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.862654 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:15 crc kubenswrapper[4775]: E1216 14:57:15.863220 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:16.363194975 +0000 UTC m=+161.314273898 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.895888 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s88th" podStartSLOduration=138.89586924 podStartE2EDuration="2m18.89586924s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:15.893661531 +0000 UTC m=+160.844740474" watchObservedRunningTime="2025-12-16 14:57:15.89586924 +0000 UTC m=+160.846948163" Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.953040 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455"] Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.965778 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:15 crc kubenswrapper[4775]: E1216 14:57:15.966187 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-16 14:57:16.466170495 +0000 UTC m=+161.417249428 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.970785 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" podStartSLOduration=138.970762678 podStartE2EDuration="2m18.970762678s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:15.952177786 +0000 UTC m=+160.903256729" watchObservedRunningTime="2025-12-16 14:57:15.970762678 +0000 UTC m=+160.921841601" Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.987221 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rpxqq" event={"ID":"35f10989-95d7-4557-9ba3-b2e966a63938","Type":"ContainerStarted","Data":"0b596ce8bdbec36ef4836c31e5b8b48ca14c3cb4c1fda49de96fa711e021a66d"} Dec 16 14:57:15 crc kubenswrapper[4775]: I1216 14:57:15.987269 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rpxqq" event={"ID":"35f10989-95d7-4557-9ba3-b2e966a63938","Type":"ContainerStarted","Data":"c74f038f2b7cdccab8ebe73b9cc2806951c33eedb3815c5a0255c3b543ef4a3d"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:15.999381 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ql6wt" event={"ID":"0eaef920-ee71-48f6-b577-02528a4ec363","Type":"ContainerStarted","Data":"29196fae2664b7fc8cbfcc036bc45811e8be43a2dd1b694b6df4fd153d3c4cd1"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.012475 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" event={"ID":"56428379-949d-4ba8-9b32-8ee7432abba7","Type":"ContainerStarted","Data":"6fe12fa110ef2449cd0040923e4e56a7fd1dbb744df204548c4c809f3f66d004"} Dec 16 14:57:16 crc kubenswrapper[4775]: W1216 14:57:16.021661 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56f482f0_f875_48f1_9a10_f06deeb5791e.slice/crio-5d287be0ee208dcaad1063455e07d42a0e22e485d443fc80cc01de09b68618a2 WatchSource:0}: Error finding container 5d287be0ee208dcaad1063455e07d42a0e22e485d443fc80cc01de09b68618a2: Status 404 returned error can't find the container with id 5d287be0ee208dcaad1063455e07d42a0e22e485d443fc80cc01de09b68618a2 Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.022086 4775 generic.go:334] "Generic (PLEG): container finished" podID="be0d6859-aa4c-4a58-97ea-3f3657d4773f" containerID="691c7e9dfd53f93ca40eedf8d0da3fbc651f256462ea3366ac3d253f9c02a026" exitCode=0 Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.022140 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" event={"ID":"be0d6859-aa4c-4a58-97ea-3f3657d4773f","Type":"ContainerDied","Data":"691c7e9dfd53f93ca40eedf8d0da3fbc651f256462ea3366ac3d253f9c02a026"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.060000 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7v5ck" 
event={"ID":"e62a9230-6690-4db9-a808-60b544283182","Type":"ContainerStarted","Data":"5676a40c4e5e912c145e404c0cb09bd377421cf8e681367d22fdf9def00907ed"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.068823 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:16 crc kubenswrapper[4775]: E1216 14:57:16.083558 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:16.583530784 +0000 UTC m=+161.534609707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.131328 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" event={"ID":"e1a2834e-159c-47f0-81a8-87d37d89a22a","Type":"ContainerStarted","Data":"68960a9b742eb9694540aa8b3b8eb9bef2495bbf59ee5e2497580fd7d8b5eab1"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.155931 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-t98bs"] Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.170068 4775 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zmgzc" event={"ID":"11dbd98a-eb9e-4d5f-b52d-df105cbeb83c","Type":"ContainerStarted","Data":"c6d10093a7a67e248c58907bd2ed9841e014a064789135d149ac5b8adabde235"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.171672 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:16 crc kubenswrapper[4775]: E1216 14:57:16.172914 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:16.672874405 +0000 UTC m=+161.623953328 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.198372 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" event={"ID":"df95f79a-fd5f-4a8d-a062-03ce319dc30b","Type":"ContainerStarted","Data":"a95db49b54866a4915dd36a9dc005230b50c8a52f2a4cedd7bea7e931168292f"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.199329 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.209274 4775 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-lj8mv container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.209336 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" podUID="df95f79a-fd5f-4a8d-a062-03ce319dc30b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.224295 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t668z" 
event={"ID":"88c27c93-f497-45d7-85f4-8e0804225421","Type":"ContainerStarted","Data":"77aef0712968bcbd2cb68a450697a06132a3b2917b9c073b2357fa15d3145df3"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.245954 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2tkrk" event={"ID":"a492f5f7-b613-4e56-8071-78f8c836e7c3","Type":"ContainerStarted","Data":"8aefae9c8d5f743145e4680305fb7f62fbe174b3f749627ecaa786cc2c7e0558"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.245995 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2tkrk" event={"ID":"a492f5f7-b613-4e56-8071-78f8c836e7c3","Type":"ContainerStarted","Data":"6cd2a8e7a5285b866ceb170d718bb3d6c01dcab1d644abbcba20c8f5c83436ae"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.259368 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-5bmtw" podStartSLOduration=139.259347906 podStartE2EDuration="2m19.259347906s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:16.256433534 +0000 UTC m=+161.207512457" watchObservedRunningTime="2025-12-16 14:57:16.259347906 +0000 UTC m=+161.210426829" Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.268118 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b5vwq" event={"ID":"4b6576b0-b9f9-4599-8876-c9b7b0a60a43","Type":"ContainerStarted","Data":"6966ca4e8bab57953eb6ef7510d67cd0de183167b916c664b687f65a26d95a0a"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.272247 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:16 crc kubenswrapper[4775]: E1216 14:57:16.273603 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:16.773580412 +0000 UTC m=+161.724659425 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.319976 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9gmd5" event={"ID":"24ba18d0-a989-4a98-99d3-bbdb42ce4bb9","Type":"ContainerStarted","Data":"b7f800a08dba6681d716603d4a13dd5370e0a4f5cfa16fea522e4c4bb08fa8ae"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.330043 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-fc2jr" event={"ID":"c21af7b0-6f27-43de-8c44-6e6519262019","Type":"ContainerStarted","Data":"fb988c5ad1b7d22b38dc7e1aab4ca8cf2e49435fdfaff32984e0beb3e9fea217"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.330104 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-fc2jr" 
event={"ID":"c21af7b0-6f27-43de-8c44-6e6519262019","Type":"ContainerStarted","Data":"9f1165088cb46e8c739d4ada51d43545483d6747c859bc60be90e871f9491b15"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.367248 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2nrns" event={"ID":"fddeaa6b-ef05-453d-a1e8-52a6de3a1a0d","Type":"ContainerStarted","Data":"a55acf79a03d85d163fd4962937f5def64d1801cc003c3daa59e55426ed13f76"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.373336 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:16 crc kubenswrapper[4775]: E1216 14:57:16.374757 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:16.874739134 +0000 UTC m=+161.825818157 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.386306 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" event={"ID":"2b5f39f2-f4e2-4306-b64c-669ca82f8869","Type":"ContainerStarted","Data":"0ac76ce0bbd6945ba21bf822d8c7383327a70e49e84cbe910367f5364d7d98a8"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.387624 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.396577 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk" podStartSLOduration=139.396555117 podStartE2EDuration="2m19.396555117s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:16.394616237 +0000 UTC m=+161.345695180" watchObservedRunningTime="2025-12-16 14:57:16.396555117 +0000 UTC m=+161.347634040" Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.468822 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rpxqq" podStartSLOduration=138.468803852 podStartE2EDuration="2m18.468803852s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:16.466433978 +0000 UTC m=+161.417512901" watchObservedRunningTime="2025-12-16 14:57:16.468803852 +0000 UTC m=+161.419882775" Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.476466 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:16 crc kubenswrapper[4775]: E1216 14:57:16.477737 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:16.977715682 +0000 UTC m=+161.928794605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.488292 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" event={"ID":"62f09707-b36c-4651-88df-e9bc6dd527a4","Type":"ContainerStarted","Data":"a3e231e3d3deb53b100f984f244cf57e4a55109c0ae8d19d0cda5365ab4ea655"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.524171 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-5fjmz" podStartSLOduration=138.524145137 podStartE2EDuration="2m18.524145137s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:16.517336454 +0000 UTC m=+161.468415377" watchObservedRunningTime="2025-12-16 14:57:16.524145137 +0000 UTC m=+161.475224080" Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.572255 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" event={"ID":"d55038e1-9978-48f8-b430-78c7da1ca5e5","Type":"ContainerStarted","Data":"82a0e88abaf9440394734859342b2537eaf6d2058ae8b0909a9e9fea73da8293"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.577945 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:16 crc kubenswrapper[4775]: E1216 14:57:16.579374 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:17.079360269 +0000 UTC m=+162.030439192 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.586875 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9gmd5" podStartSLOduration=138.586858544 podStartE2EDuration="2m18.586858544s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:16.584582342 +0000 UTC m=+161.535661275" watchObservedRunningTime="2025-12-16 14:57:16.586858544 +0000 UTC m=+161.537937467" Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.632220 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-2nrns" podStartSLOduration=6.632194215 podStartE2EDuration="6.632194215s" podCreationTimestamp="2025-12-16 14:57:10 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:16.630415339 +0000 UTC m=+161.581494262" watchObservedRunningTime="2025-12-16 14:57:16.632194215 +0000 UTC m=+161.583273138" Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.698157 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:16 crc kubenswrapper[4775]: E1216 14:57:16.699160 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:17.199140594 +0000 UTC m=+162.150219517 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.700245 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" event={"ID":"b36ff831-d91c-4350-a36b-bd0625ffb661","Type":"ContainerStarted","Data":"2ee3f783093c346e7cbb789197f327f74cb8c31422af043f96c4391e13e77b8d"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.724342 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-54js6" event={"ID":"e45773ce-d026-4240-bde4-17339b57ef93","Type":"ContainerStarted","Data":"1124b4a25bdacb58b6aa46d4469da5ba3979f2d15305e70414ab90fe54bfa928"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.780724 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649" event={"ID":"49286131-fb39-4261-bc4f-db68474c8fa0","Type":"ContainerStarted","Data":"d32155508bfbd20a8a25de447dcfbb42c90e03ed93815b4f3a6501cbfb1d8374"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.783204 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649" Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.783308 4775 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-74649 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 
10.217.0.21:8443: connect: connection refused" start-of-body= Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.783354 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649" podUID="49286131-fb39-4261-bc4f-db68474c8fa0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.785040 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-fc2jr" podStartSLOduration=139.785025276 podStartE2EDuration="2m19.785025276s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:16.783714406 +0000 UTC m=+161.734793349" watchObservedRunningTime="2025-12-16 14:57:16.785025276 +0000 UTC m=+161.736104199" Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.815348 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:16 crc kubenswrapper[4775]: E1216 14:57:16.816618 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:17.316604227 +0000 UTC m=+162.267683150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.850591 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv" event={"ID":"9ed9516f-e373-480b-a645-ad35bec98fa4","Type":"ContainerStarted","Data":"dd5b6e5c60517866b0794ed76b1653aa2f7627d9cad26b4e6b2e861586e74738"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.895775 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" podStartSLOduration=138.895753068 podStartE2EDuration="2m18.895753068s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:16.848196027 +0000 UTC m=+161.799274980" watchObservedRunningTime="2025-12-16 14:57:16.895753068 +0000 UTC m=+161.846831991" Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.900645 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dw4sb" event={"ID":"5d72e9f6-db87-4192-8f08-32788a4ad601","Type":"ContainerStarted","Data":"4707b0406369b31d83886dbbf951fbb3caa0802efb2592f581c05d87947e945b"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.925102 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:16 crc kubenswrapper[4775]: E1216 14:57:16.925539 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:17.425521271 +0000 UTC m=+162.376600184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.958142 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rfplx" event={"ID":"b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a","Type":"ContainerStarted","Data":"3756cbc3e6265d7204ab0ba345fe9f49594e86916e7a2a9b3341dc0c1990353b"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.959794 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7v5ck" podStartSLOduration=6.959782666 podStartE2EDuration="6.959782666s" podCreationTimestamp="2025-12-16 14:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:16.954292513 +0000 UTC m=+161.905371436" watchObservedRunningTime="2025-12-16 14:57:16.959782666 +0000 UTC m=+161.910861589" Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.959925 
4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2tkrk" podStartSLOduration=138.95992109 podStartE2EDuration="2m18.95992109s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:16.904283406 +0000 UTC m=+161.855362329" watchObservedRunningTime="2025-12-16 14:57:16.95992109 +0000 UTC m=+161.911000013" Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.976182 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-dw4sb" Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.983513 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz" event={"ID":"2df1b126-33ca-4ad8-911b-44c4f1457b99","Type":"ContainerStarted","Data":"de17318edcee7e0dcdaaaa4934863a0696d08b6546a04930fcbfb88643c56a41"} Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.991400 4775 patch_prober.go:28] interesting pod/router-default-5444994796-dw4sb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 14:57:16 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Dec 16 14:57:16 crc kubenswrapper[4775]: [+]process-running ok Dec 16 14:57:16 crc kubenswrapper[4775]: healthz check failed Dec 16 14:57:16 crc kubenswrapper[4775]: I1216 14:57:16.991484 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dw4sb" podUID="5d72e9f6-db87-4192-8f08-32788a4ad601" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 
14:57:17.000931 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649" podStartSLOduration=139.000914195 podStartE2EDuration="2m19.000914195s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:16.998990825 +0000 UTC m=+161.950069778" watchObservedRunningTime="2025-12-16 14:57:17.000914195 +0000 UTC m=+161.951993128" Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.018229 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-d9jzf" event={"ID":"40f1d8c0-d195-457c-909e-10fd294a0bfc","Type":"ContainerStarted","Data":"edac14f0ebbfbb4e4f1ec16806231cd66b39ee57b85b0bd691a4275258e3d578"} Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.031787 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:17 crc kubenswrapper[4775]: E1216 14:57:17.033679 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:17.533665231 +0000 UTC m=+162.484744154 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.081328 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v798k" event={"ID":"4e5a6b47-360c-4b64-9ba3-15edeb2006fa","Type":"ContainerStarted","Data":"07c64da1ce5964147eacebcb9cce517707bfa3c5031a73d6c82512bece6b878f"} Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.099721 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" event={"ID":"5f57ccda-46a6-46ca-883a-0ae41ed65a07","Type":"ContainerStarted","Data":"ad857ee2f175c8557a2c3faf4aa33a03a0afc49c7514a368c29e7da4a90ae4cb"} Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.101108 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-5bmtw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.101167 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5bmtw" podUID="edb75c2a-9e6f-4a80-aadd-38416ba9c9a4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.110565 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" podStartSLOduration=140.110546902 podStartE2EDuration="2m20.110546902s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:17.107026501 +0000 UTC m=+162.058105434" watchObservedRunningTime="2025-12-16 14:57:17.110546902 +0000 UTC m=+162.061625825" Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.114050 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-dw4sb" podStartSLOduration=139.114029931 podStartE2EDuration="2m19.114029931s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:17.060630147 +0000 UTC m=+162.011709070" watchObservedRunningTime="2025-12-16 14:57:17.114029931 +0000 UTC m=+162.065108854" Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.132939 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:17 crc kubenswrapper[4775]: E1216 14:57:17.133339 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:17.633321826 +0000 UTC m=+162.584400749 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.139801 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-rfplx" podStartSLOduration=139.139777118 podStartE2EDuration="2m19.139777118s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:17.128915567 +0000 UTC m=+162.079994490" watchObservedRunningTime="2025-12-16 14:57:17.139777118 +0000 UTC m=+162.090856041" Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.152569 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c58rz" podStartSLOduration=139.152551999 podStartE2EDuration="2m19.152551999s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:17.147316285 +0000 UTC m=+162.098395198" watchObservedRunningTime="2025-12-16 14:57:17.152551999 +0000 UTC m=+162.103630922" Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.162780 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xrjhk" Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.237964 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:17 crc kubenswrapper[4775]: E1216 14:57:17.240848 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:17.740836936 +0000 UTC m=+162.691915859 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.340395 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:17 crc kubenswrapper[4775]: E1216 14:57:17.340692 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:17.840656066 +0000 UTC m=+162.791734999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.340791 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:17 crc kubenswrapper[4775]: E1216 14:57:17.341192 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:17.841176443 +0000 UTC m=+162.792255366 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.389076 4775 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-ms9lk container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.14:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.389144 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" podUID="2b5f39f2-f4e2-4306-b64c-669ca82f8869" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.14:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.407286 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.407317 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.409744 4775 patch_prober.go:28] interesting pod/apiserver-76f77b778f-nxkgw container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 16 14:57:17 crc 
kubenswrapper[4775]: [+]log ok Dec 16 14:57:17 crc kubenswrapper[4775]: [+]etcd ok Dec 16 14:57:17 crc kubenswrapper[4775]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 16 14:57:17 crc kubenswrapper[4775]: [+]poststarthook/generic-apiserver-start-informers ok Dec 16 14:57:17 crc kubenswrapper[4775]: [+]poststarthook/max-in-flight-filter ok Dec 16 14:57:17 crc kubenswrapper[4775]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 16 14:57:17 crc kubenswrapper[4775]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 16 14:57:17 crc kubenswrapper[4775]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 16 14:57:17 crc kubenswrapper[4775]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 16 14:57:17 crc kubenswrapper[4775]: [+]poststarthook/project.openshift.io-projectcache ok Dec 16 14:57:17 crc kubenswrapper[4775]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 16 14:57:17 crc kubenswrapper[4775]: [+]poststarthook/openshift.io-startinformers ok Dec 16 14:57:17 crc kubenswrapper[4775]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 16 14:57:17 crc kubenswrapper[4775]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 16 14:57:17 crc kubenswrapper[4775]: livez check failed Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.409806 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-nxkgw" podUID="d55038e1-9978-48f8-b430-78c7da1ca5e5" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.441414 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:17 crc kubenswrapper[4775]: E1216 14:57:17.442203 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:17.942184609 +0000 UTC m=+162.893263532 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.550150 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:17 crc kubenswrapper[4775]: E1216 14:57:17.550735 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:18.050722002 +0000 UTC m=+163.001800925 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.651342 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:17 crc kubenswrapper[4775]: E1216 14:57:17.651477 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:18.15144491 +0000 UTC m=+163.102523833 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.651679 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:17 crc kubenswrapper[4775]: E1216 14:57:17.652021 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:18.152009928 +0000 UTC m=+163.103088851 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.753128 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:17 crc kubenswrapper[4775]: E1216 14:57:17.753483 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:18.253465088 +0000 UTC m=+163.204544011 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.854746 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:17 crc kubenswrapper[4775]: E1216 14:57:17.855450 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:18.355432795 +0000 UTC m=+163.306511718 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.927706 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dh2bb"] Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.928808 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dh2bb" Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.931785 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.952689 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dh2bb"] Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.959962 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:17 crc kubenswrapper[4775]: E1216 14:57:17.960440 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:18.460422406 +0000 UTC m=+163.411501329 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.978025 4775 patch_prober.go:28] interesting pod/router-default-5444994796-dw4sb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 14:57:17 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Dec 16 14:57:17 crc kubenswrapper[4775]: [+]process-running ok Dec 16 14:57:17 crc kubenswrapper[4775]: healthz check failed Dec 16 14:57:17 crc kubenswrapper[4775]: I1216 14:57:17.978081 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dw4sb" podUID="5d72e9f6-db87-4192-8f08-32788a4ad601" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.062537 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae4804bb-2669-48fc-aa42-3e4f1c94323b-catalog-content\") pod \"certified-operators-dh2bb\" (UID: \"ae4804bb-2669-48fc-aa42-3e4f1c94323b\") " pod="openshift-marketplace/certified-operators-dh2bb" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.062605 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4dgn\" (UniqueName: \"kubernetes.io/projected/ae4804bb-2669-48fc-aa42-3e4f1c94323b-kube-api-access-v4dgn\") pod 
\"certified-operators-dh2bb\" (UID: \"ae4804bb-2669-48fc-aa42-3e4f1c94323b\") " pod="openshift-marketplace/certified-operators-dh2bb" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.062631 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.062676 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae4804bb-2669-48fc-aa42-3e4f1c94323b-utilities\") pod \"certified-operators-dh2bb\" (UID: \"ae4804bb-2669-48fc-aa42-3e4f1c94323b\") " pod="openshift-marketplace/certified-operators-dh2bb" Dec 16 14:57:18 crc kubenswrapper[4775]: E1216 14:57:18.063181 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:18.563165248 +0000 UTC m=+163.514244171 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.127239 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t98bs" event={"ID":"26e83196-9fbb-41ab-a359-d404437ee1e9","Type":"ContainerStarted","Data":"fde176634fbe8791224b8c7d2e5464596d53bcb6a2cffd027d3df7faaa751e4a"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.145233 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649" event={"ID":"49286131-fb39-4261-bc4f-db68474c8fa0","Type":"ContainerStarted","Data":"9b684962a0ff7e87d1686ee15632b4988b3deae38776a5ff0b2179df6c1b06da"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.151580 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-74649" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.168788 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:18 crc kubenswrapper[4775]: E1216 14:57:18.169383 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-12-16 14:57:18.669359927 +0000 UTC m=+163.620438850 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.169791 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae4804bb-2669-48fc-aa42-3e4f1c94323b-catalog-content\") pod \"certified-operators-dh2bb\" (UID: \"ae4804bb-2669-48fc-aa42-3e4f1c94323b\") " pod="openshift-marketplace/certified-operators-dh2bb" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.169943 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4dgn\" (UniqueName: \"kubernetes.io/projected/ae4804bb-2669-48fc-aa42-3e4f1c94323b-kube-api-access-v4dgn\") pod \"certified-operators-dh2bb\" (UID: \"ae4804bb-2669-48fc-aa42-3e4f1c94323b\") " pod="openshift-marketplace/certified-operators-dh2bb" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.170056 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.170172 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/ae4804bb-2669-48fc-aa42-3e4f1c94323b-utilities\") pod \"certified-operators-dh2bb\" (UID: \"ae4804bb-2669-48fc-aa42-3e4f1c94323b\") " pod="openshift-marketplace/certified-operators-dh2bb" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.170760 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae4804bb-2669-48fc-aa42-3e4f1c94323b-utilities\") pod \"certified-operators-dh2bb\" (UID: \"ae4804bb-2669-48fc-aa42-3e4f1c94323b\") " pod="openshift-marketplace/certified-operators-dh2bb" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.171157 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae4804bb-2669-48fc-aa42-3e4f1c94323b-catalog-content\") pod \"certified-operators-dh2bb\" (UID: \"ae4804bb-2669-48fc-aa42-3e4f1c94323b\") " pod="openshift-marketplace/certified-operators-dh2bb" Dec 16 14:57:18 crc kubenswrapper[4775]: E1216 14:57:18.171830 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:18.671820054 +0000 UTC m=+163.622898977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.179246 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-54js6" event={"ID":"e45773ce-d026-4240-bde4-17339b57ef93","Type":"ContainerStarted","Data":"4a758a8f72498bd63540ea8001797f95d3facf04bcfee26ec2bb91f1fadd2937"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.220240 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4dgn\" (UniqueName: \"kubernetes.io/projected/ae4804bb-2669-48fc-aa42-3e4f1c94323b-kube-api-access-v4dgn\") pod \"certified-operators-dh2bb\" (UID: \"ae4804bb-2669-48fc-aa42-3e4f1c94323b\") " pod="openshift-marketplace/certified-operators-dh2bb" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.228962 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" event={"ID":"a35488ea-bcad-480f-9e21-756619b1ed3b","Type":"ContainerStarted","Data":"151c987b585c6a4fe60481d53b7b5b0ba66b0e280a2e3129d164b11cf73b5b64"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.229013 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" event={"ID":"a35488ea-bcad-480f-9e21-756619b1ed3b","Type":"ContainerStarted","Data":"ea541718b5a809c503c6349b45ab44c0de28f10d7d0cb441bcf67db0230264eb"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.259149 4775 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-54js6" podStartSLOduration=140.259129342 podStartE2EDuration="2m20.259129342s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:18.208113112 +0000 UTC m=+163.159192055" watchObservedRunningTime="2025-12-16 14:57:18.259129342 +0000 UTC m=+163.210208265" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.260218 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2r426" podStartSLOduration=140.260210135 podStartE2EDuration="2m20.260210135s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:18.259996288 +0000 UTC m=+163.211075261" watchObservedRunningTime="2025-12-16 14:57:18.260210135 +0000 UTC m=+163.211289058" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.267407 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dh2bb" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.274268 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:18 crc kubenswrapper[4775]: E1216 14:57:18.275751 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 14:57:18.775731832 +0000 UTC m=+163.726810745 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.294312 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t668z" event={"ID":"88c27c93-f497-45d7-85f4-8e0804225421","Type":"ContainerStarted","Data":"fbdfb0a693f6a4bbcf39eb0052cffb6cdc9b3afc02052bd30167f670f2f5338d"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.312213 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455" event={"ID":"a1734b68-7b3d-49f0-9398-879da24fa19d","Type":"ContainerStarted","Data":"cd47c42773fde89848aef40302bc63d5eda4f961265c90cc888a03e584302595"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.313253 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455" event={"ID":"a1734b68-7b3d-49f0-9398-879da24fa19d","Type":"ContainerStarted","Data":"f338d89ac587d68772c28a4548095e8a48de9d14981f753c2c0766e2554498af"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.347020 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-t668z" podStartSLOduration=140.346999006 podStartE2EDuration="2m20.346999006s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-16 14:57:18.328013051 +0000 UTC m=+163.279091984" watchObservedRunningTime="2025-12-16 14:57:18.346999006 +0000 UTC m=+163.298077929" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.348362 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dm57c"] Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.349278 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dm57c" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.354100 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ql6wt" event={"ID":"0eaef920-ee71-48f6-b577-02528a4ec363","Type":"ContainerStarted","Data":"145de36c89279c3bd2e9d46141133f587ca7198a479e723112cf938318a071b4"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.354164 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ql6wt" event={"ID":"0eaef920-ee71-48f6-b577-02528a4ec363","Type":"ContainerStarted","Data":"b7b52bafa72d7c79c4659d6f839c2678ea3b4b91fa16cbe849366eccbe48f832"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.369578 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dm57c"] Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.382150 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.382176 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455" podStartSLOduration=141.382157749 podStartE2EDuration="2m21.382157749s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:18.381000252 +0000 UTC m=+163.332079175" watchObservedRunningTime="2025-12-16 14:57:18.382157749 +0000 UTC m=+163.333236672" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.386424 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" event={"ID":"be0d6859-aa4c-4a58-97ea-3f3657d4773f","Type":"ContainerStarted","Data":"144365e249cc2b530351d89cae4fdc913d52f709d7880061d1b3b90f64a316ed"} Dec 16 14:57:18 crc kubenswrapper[4775]: E1216 14:57:18.389157 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:18.889137677 +0000 UTC m=+163.840216650 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.446506 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" event={"ID":"5f57ccda-46a6-46ca-883a-0ae41ed65a07","Type":"ContainerStarted","Data":"657c281f1a309cb8bb6837c375271f6904e69793c1c9da699c1d8fe62c3ed15e"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.446860 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" event={"ID":"5f57ccda-46a6-46ca-883a-0ae41ed65a07","Type":"ContainerStarted","Data":"622dac7684556e1824aa9d1ea13f51103c6780800839226a0b10eaca485bcde8"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.470667 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rfplx" event={"ID":"b8afbcbd-ab0e-44cf-98fc-fa5c2c50647a","Type":"ContainerStarted","Data":"9c3dfa4d11d4b26000b9aafa7c62dcc85e25ac7b250a799e6b02222feabeaf38"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.480723 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ql6wt" podStartSLOduration=140.480704609 podStartE2EDuration="2m20.480704609s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:18.479430568 +0000 UTC m=+163.430509481" 
watchObservedRunningTime="2025-12-16 14:57:18.480704609 +0000 UTC m=+163.431783532" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.485098 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.485454 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-catalog-content\") pod \"certified-operators-dm57c\" (UID: \"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a\") " pod="openshift-marketplace/certified-operators-dm57c" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.485495 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-utilities\") pod \"certified-operators-dm57c\" (UID: \"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a\") " pod="openshift-marketplace/certified-operators-dm57c" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.485563 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfg7j\" (UniqueName: \"kubernetes.io/projected/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-kube-api-access-bfg7j\") pod \"certified-operators-dm57c\" (UID: \"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a\") " pod="openshift-marketplace/certified-operators-dm57c" Dec 16 14:57:18 crc kubenswrapper[4775]: E1216 14:57:18.485742 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-12-16 14:57:18.985709735 +0000 UTC m=+163.936788718 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.491357 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zmgzc" event={"ID":"11dbd98a-eb9e-4d5f-b52d-df105cbeb83c","Type":"ContainerStarted","Data":"a38804460279e4133d366815fcf144f3633bcf150f9ae8a4295d2fc2516a9f3c"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.506760 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" event={"ID":"56428379-949d-4ba8-9b32-8ee7432abba7","Type":"ContainerStarted","Data":"ba8d8d9104a011a3f1f25f085fd29769dade7eac5a33e86f69504eac37288b3f"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.542609 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8l8g4"] Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.542927 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m" podStartSLOduration=140.542915758 podStartE2EDuration="2m20.542915758s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:18.528591809 +0000 UTC m=+163.479670732" 
watchObservedRunningTime="2025-12-16 14:57:18.542915758 +0000 UTC m=+163.493994671" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.544752 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-d9jzf" event={"ID":"40f1d8c0-d195-457c-909e-10fd294a0bfc","Type":"ContainerStarted","Data":"30d322295c37ce113d13351e52429a40b873b1fc685cc78a84c8bd27c59d489c"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.544795 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-d9jzf" event={"ID":"40f1d8c0-d195-457c-909e-10fd294a0bfc","Type":"ContainerStarted","Data":"ada4b8ec6ba3ce54cc8bbf5107a9ec5580a659f12943119f74c818d62e12b432"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.544944 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8l8g4" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.549688 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8l8g4"] Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.584536 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.586989 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfg7j\" (UniqueName: \"kubernetes.io/projected/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-kube-api-access-bfg7j\") pod \"certified-operators-dm57c\" (UID: \"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a\") " pod="openshift-marketplace/certified-operators-dm57c" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.587036 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ef64597f-59f1-47be-afc6-aa95fb3c355c-catalog-content\") pod \"community-operators-8l8g4\" (UID: \"ef64597f-59f1-47be-afc6-aa95fb3c355c\") " pod="openshift-marketplace/community-operators-8l8g4" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.587064 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmvtx\" (UniqueName: \"kubernetes.io/projected/ef64597f-59f1-47be-afc6-aa95fb3c355c-kube-api-access-xmvtx\") pod \"community-operators-8l8g4\" (UID: \"ef64597f-59f1-47be-afc6-aa95fb3c355c\") " pod="openshift-marketplace/community-operators-8l8g4" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.587244 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.587307 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-catalog-content\") pod \"certified-operators-dm57c\" (UID: \"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a\") " pod="openshift-marketplace/certified-operators-dm57c" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.587369 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-utilities\") pod \"certified-operators-dm57c\" (UID: \"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a\") " pod="openshift-marketplace/certified-operators-dm57c" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.587422 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef64597f-59f1-47be-afc6-aa95fb3c355c-utilities\") pod \"community-operators-8l8g4\" (UID: \"ef64597f-59f1-47be-afc6-aa95fb3c355c\") " pod="openshift-marketplace/community-operators-8l8g4" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.589722 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-catalog-content\") pod \"certified-operators-dm57c\" (UID: \"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a\") " pod="openshift-marketplace/certified-operators-dm57c" Dec 16 14:57:18 crc kubenswrapper[4775]: E1216 14:57:18.590199 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:19.09018444 +0000 UTC m=+164.041263363 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.590281 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v798k" event={"ID":"4e5a6b47-360c-4b64-9ba3-15edeb2006fa","Type":"ContainerStarted","Data":"bc56b8dc7b59ca1760939855364402ce7c0aff0c72edb15a9b79b2459d2ab476"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.590329 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v798k" event={"ID":"4e5a6b47-360c-4b64-9ba3-15edeb2006fa","Type":"ContainerStarted","Data":"1da91a660e65083c18b97f84eb9ac6e7c8f7bb826e3fb069f084be2822248b18"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.590650 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-utilities\") pod \"certified-operators-dm57c\" (UID: \"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a\") " pod="openshift-marketplace/certified-operators-dm57c" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.631330 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b5vwq" event={"ID":"4b6576b0-b9f9-4599-8876-c9b7b0a60a43","Type":"ContainerStarted","Data":"b28101553cf008a320f2adc2cb3fd378def09acb4f35e63c86cd6c1158e89fb3"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.632729 4775 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b5vwq" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.655710 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv" event={"ID":"9ed9516f-e373-480b-a645-ad35bec98fa4","Type":"ContainerStarted","Data":"f0d7ace68b5b5080c46fdaef41ce96f215c04e21356c134b7573cc196d5a9ed4"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.655764 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv" event={"ID":"9ed9516f-e373-480b-a645-ad35bec98fa4","Type":"ContainerStarted","Data":"538472a3e90f58d5980802df13b77bf7e886cc4787d92fac0adaa122fd70ba01"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.692781 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.693027 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef64597f-59f1-47be-afc6-aa95fb3c355c-catalog-content\") pod \"community-operators-8l8g4\" (UID: \"ef64597f-59f1-47be-afc6-aa95fb3c355c\") " pod="openshift-marketplace/community-operators-8l8g4" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.693051 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmvtx\" (UniqueName: \"kubernetes.io/projected/ef64597f-59f1-47be-afc6-aa95fb3c355c-kube-api-access-xmvtx\") pod \"community-operators-8l8g4\" (UID: \"ef64597f-59f1-47be-afc6-aa95fb3c355c\") " 
pod="openshift-marketplace/community-operators-8l8g4" Dec 16 14:57:18 crc kubenswrapper[4775]: E1216 14:57:18.693152 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:19.193122188 +0000 UTC m=+164.144201111 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.693309 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.693457 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef64597f-59f1-47be-afc6-aa95fb3c355c-utilities\") pod \"community-operators-8l8g4\" (UID: \"ef64597f-59f1-47be-afc6-aa95fb3c355c\") " pod="openshift-marketplace/community-operators-8l8g4" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.693435 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wlgbd" podStartSLOduration=140.693418377 
podStartE2EDuration="2m20.693418377s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:18.584833922 +0000 UTC m=+163.535912845" watchObservedRunningTime="2025-12-16 14:57:18.693418377 +0000 UTC m=+163.644497300" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.693836 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef64597f-59f1-47be-afc6-aa95fb3c355c-catalog-content\") pod \"community-operators-8l8g4\" (UID: \"ef64597f-59f1-47be-afc6-aa95fb3c355c\") " pod="openshift-marketplace/community-operators-8l8g4" Dec 16 14:57:18 crc kubenswrapper[4775]: E1216 14:57:18.694142 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:19.194128589 +0000 UTC m=+164.145207502 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.694452 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef64597f-59f1-47be-afc6-aa95fb3c355c-utilities\") pod \"community-operators-8l8g4\" (UID: \"ef64597f-59f1-47be-afc6-aa95fb3c355c\") " pod="openshift-marketplace/community-operators-8l8g4" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.694513 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zmgzc" podStartSLOduration=140.694508771 podStartE2EDuration="2m20.694508771s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:18.69254508 +0000 UTC m=+163.643624003" watchObservedRunningTime="2025-12-16 14:57:18.694508771 +0000 UTC m=+163.645587694" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.733877 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmvtx\" (UniqueName: \"kubernetes.io/projected/ef64597f-59f1-47be-afc6-aa95fb3c355c-kube-api-access-xmvtx\") pod \"community-operators-8l8g4\" (UID: \"ef64597f-59f1-47be-afc6-aa95fb3c355c\") " pod="openshift-marketplace/community-operators-8l8g4" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.759967 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-hhm9d"] Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.760945 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfg7j\" (UniqueName: \"kubernetes.io/projected/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-kube-api-access-bfg7j\") pod \"certified-operators-dm57c\" (UID: \"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a\") " pod="openshift-marketplace/certified-operators-dm57c" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.761388 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hhm9d" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.773401 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6kp7p" event={"ID":"56f482f0-f875-48f1-9a10-f06deeb5791e","Type":"ContainerStarted","Data":"2aeac84d9aa34ff93cf9fdb7294ac43ce073348378262b6163c7fb79ecb15052"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.773464 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6kp7p" event={"ID":"56f482f0-f875-48f1-9a10-f06deeb5791e","Type":"ContainerStarted","Data":"5d287be0ee208dcaad1063455e07d42a0e22e485d443fc80cc01de09b68618a2"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.773518 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hhm9d"] Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.773666 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-6kp7p" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.797255 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.797507 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce7e6431-8250-485f-a202-a781b4b719cb-utilities\") pod \"community-operators-hhm9d\" (UID: \"ce7e6431-8250-485f-a202-a781b4b719cb\") " pod="openshift-marketplace/community-operators-hhm9d" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.797544 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4w5x\" (UniqueName: \"kubernetes.io/projected/ce7e6431-8250-485f-a202-a781b4b719cb-kube-api-access-t4w5x\") pod \"community-operators-hhm9d\" (UID: \"ce7e6431-8250-485f-a202-a781b4b719cb\") " pod="openshift-marketplace/community-operators-hhm9d" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.797566 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce7e6431-8250-485f-a202-a781b4b719cb-catalog-content\") pod \"community-operators-hhm9d\" (UID: \"ce7e6431-8250-485f-a202-a781b4b719cb\") " pod="openshift-marketplace/community-operators-hhm9d" Dec 16 14:57:18 crc kubenswrapper[4775]: E1216 14:57:18.798288 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:19.298269404 +0000 UTC m=+164.249348327 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.817448 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" event={"ID":"62f09707-b36c-4651-88df-e9bc6dd527a4","Type":"ContainerStarted","Data":"d4bcd92323c981285dfab74660325f974fc76ea0515eaa8455021daa47138173"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.817526 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" event={"ID":"62f09707-b36c-4651-88df-e9bc6dd527a4","Type":"ContainerStarted","Data":"10864bdf3c6e276cc4aa9374fb8cbc0b2b1f3e99c6ee8468e5431f8736807078"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.818996 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" event={"ID":"b36ff831-d91c-4350-a36b-bd0625ffb661","Type":"ContainerStarted","Data":"192c1b63839f42897cd25ce2219a6d4505efeac1667b70d0d68169cb9800dce5"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.819986 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.821518 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dw4sb" event={"ID":"5d72e9f6-db87-4192-8f08-32788a4ad601","Type":"ContainerStarted","Data":"c472814be9c355cc31eeaae14f0049fe03e08cc6e9c1a34bcb1b6b9a08180715"} Dec 16 14:57:18 
crc kubenswrapper[4775]: I1216 14:57:18.822902 4775 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-58rfh container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.822937 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" podUID="b36ff831-d91c-4350-a36b-bd0625ffb661" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.823272 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" event={"ID":"f057db8e-b509-48ad-967f-2ae735085e29","Type":"ContainerStarted","Data":"9dde43f0130af443959251a1f0c16a6d939b26bea5ecef692878259b1fa9b8b0"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.823299 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" event={"ID":"f057db8e-b509-48ad-967f-2ae735085e29","Type":"ContainerStarted","Data":"984d4d9c15479573e573ccf39d427c9c86d3692009b85406eccecb4689a390c6"} Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.868497 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-d9jzf" podStartSLOduration=140.868469415 podStartE2EDuration="2m20.868469415s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:18.787389043 +0000 UTC m=+163.738467966" watchObservedRunningTime="2025-12-16 14:57:18.868469415 +0000 UTC 
m=+163.819548338" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.869023 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xklhw" podStartSLOduration=141.869018862 podStartE2EDuration="2m21.869018862s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:18.8414994 +0000 UTC m=+163.792578323" watchObservedRunningTime="2025-12-16 14:57:18.869018862 +0000 UTC m=+163.820097785" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.871293 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.884295 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj8mv" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.891582 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" podStartSLOduration=140.891566399 podStartE2EDuration="2m20.891566399s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:18.886997755 +0000 UTC m=+163.838076688" watchObservedRunningTime="2025-12-16 14:57:18.891566399 +0000 UTC m=+163.842645322" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.901173 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce7e6431-8250-485f-a202-a781b4b719cb-utilities\") pod \"community-operators-hhm9d\" (UID: \"ce7e6431-8250-485f-a202-a781b4b719cb\") " 
pod="openshift-marketplace/community-operators-hhm9d" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.901374 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4w5x\" (UniqueName: \"kubernetes.io/projected/ce7e6431-8250-485f-a202-a781b4b719cb-kube-api-access-t4w5x\") pod \"community-operators-hhm9d\" (UID: \"ce7e6431-8250-485f-a202-a781b4b719cb\") " pod="openshift-marketplace/community-operators-hhm9d" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.901463 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce7e6431-8250-485f-a202-a781b4b719cb-catalog-content\") pod \"community-operators-hhm9d\" (UID: \"ce7e6431-8250-485f-a202-a781b4b719cb\") " pod="openshift-marketplace/community-operators-hhm9d" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.901602 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.904518 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce7e6431-8250-485f-a202-a781b4b719cb-catalog-content\") pod \"community-operators-hhm9d\" (UID: \"ce7e6431-8250-485f-a202-a781b4b719cb\") " pod="openshift-marketplace/community-operators-hhm9d" Dec 16 14:57:18 crc kubenswrapper[4775]: E1216 14:57:18.914145 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-16 14:57:19.414125397 +0000 UTC m=+164.365204320 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.915592 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8l8g4" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.915960 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce7e6431-8250-485f-a202-a781b4b719cb-utilities\") pod \"community-operators-hhm9d\" (UID: \"ce7e6431-8250-485f-a202-a781b4b719cb\") " pod="openshift-marketplace/community-operators-hhm9d" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.981794 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4w5x\" (UniqueName: \"kubernetes.io/projected/ce7e6431-8250-485f-a202-a781b4b719cb-kube-api-access-t4w5x\") pod \"community-operators-hhm9d\" (UID: \"ce7e6431-8250-485f-a202-a781b4b719cb\") " pod="openshift-marketplace/community-operators-hhm9d" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.987045 4775 patch_prober.go:28] interesting pod/router-default-5444994796-dw4sb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 14:57:18 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Dec 16 14:57:18 crc kubenswrapper[4775]: [+]process-running ok Dec 
16 14:57:18 crc kubenswrapper[4775]: healthz check failed Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.992580 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dw4sb" podUID="5d72e9f6-db87-4192-8f08-32788a4ad601" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:57:18 crc kubenswrapper[4775]: I1216 14:57:18.989339 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dm57c" Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.007750 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:19 crc kubenswrapper[4775]: E1216 14:57:19.008257 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:19.508238787 +0000 UTC m=+164.459317710 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.020282 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" podStartSLOduration=141.020266054 podStartE2EDuration="2m21.020266054s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:19.018665153 +0000 UTC m=+163.969744096" watchObservedRunningTime="2025-12-16 14:57:19.020266054 +0000 UTC m=+163.971344977" Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.020757 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v798k" podStartSLOduration=142.02075137 podStartE2EDuration="2m22.02075137s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:18.956395322 +0000 UTC m=+163.907474265" watchObservedRunningTime="2025-12-16 14:57:19.02075137 +0000 UTC m=+163.971830293" Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.096115 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6kp7p" podStartSLOduration=9.096094752 podStartE2EDuration="9.096094752s" podCreationTimestamp="2025-12-16 14:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:19.093304234 +0000 UTC m=+164.044383157" watchObservedRunningTime="2025-12-16 14:57:19.096094752 +0000 UTC m=+164.047173675" Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.113806 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:19 crc kubenswrapper[4775]: E1216 14:57:19.114356 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:19.614343003 +0000 UTC m=+164.565421926 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.120582 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dh2bb"] Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.125069 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b5vwq" podStartSLOduration=141.125049479 podStartE2EDuration="2m21.125049479s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:19.123230942 +0000 UTC m=+164.074309865" watchObservedRunningTime="2025-12-16 14:57:19.125049479 +0000 UTC m=+164.076128402" Dec 16 14:57:19 crc kubenswrapper[4775]: W1216 14:57:19.151033 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae4804bb_2669_48fc_aa42_3e4f1c94323b.slice/crio-8823a04aa97e5787bdc9218d608ccf5d912f95993f60aa6f811ee688743c00e7 WatchSource:0}: Error finding container 8823a04aa97e5787bdc9218d608ccf5d912f95993f60aa6f811ee688743c00e7: Status 404 returned error can't find the container with id 8823a04aa97e5787bdc9218d608ccf5d912f95993f60aa6f811ee688743c00e7 Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.164216 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hhm9d" Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.168777 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pw4wv" podStartSLOduration=141.16876341 podStartE2EDuration="2m21.16876341s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:19.166233011 +0000 UTC m=+164.117311944" watchObservedRunningTime="2025-12-16 14:57:19.16876341 +0000 UTC m=+164.119842333" Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.213250 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-86sn4" podStartSLOduration=141.213220453 podStartE2EDuration="2m21.213220453s" podCreationTimestamp="2025-12-16 14:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:19.204479579 +0000 UTC m=+164.155558502" watchObservedRunningTime="2025-12-16 14:57:19.213220453 +0000 UTC m=+164.164299376" Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.214643 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:19 crc kubenswrapper[4775]: E1216 14:57:19.214955 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 14:57:19.714913236 +0000 UTC m=+164.665992159 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.215144 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:19 crc kubenswrapper[4775]: E1216 14:57:19.215460 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:19.715452753 +0000 UTC m=+164.666531676 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.317566 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:19 crc kubenswrapper[4775]: E1216 14:57:19.317874 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:19.817836533 +0000 UTC m=+164.768915466 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.318178 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:19 crc kubenswrapper[4775]: E1216 14:57:19.318554 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:19.818538035 +0000 UTC m=+164.769616958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.419521 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:19 crc kubenswrapper[4775]: E1216 14:57:19.419918 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:19.919898763 +0000 UTC m=+164.870977686 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.528580 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:19 crc kubenswrapper[4775]: E1216 14:57:19.529036 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.029019834 +0000 UTC m=+164.980098757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.607032 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dm57c"] Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.629381 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:19 crc kubenswrapper[4775]: E1216 14:57:19.629737 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.129712991 +0000 UTC m=+165.080791924 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.629773 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:19 crc kubenswrapper[4775]: E1216 14:57:19.630194 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.130183526 +0000 UTC m=+165.081262449 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.712757 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8l8g4"] Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.731031 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:19 crc kubenswrapper[4775]: E1216 14:57:19.731360 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.231335447 +0000 UTC m=+165.182414370 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.731592 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:19 crc kubenswrapper[4775]: E1216 14:57:19.732060 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.232043409 +0000 UTC m=+165.183122332 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.833484 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:19 crc kubenswrapper[4775]: E1216 14:57:19.834010 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.333990505 +0000 UTC m=+165.285069428 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.876424 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t668z" event={"ID":"88c27c93-f497-45d7-85f4-8e0804225421","Type":"ContainerStarted","Data":"cfd384b4c885dbe1adbc1b34c30695b0e29de18f0166f387d02d143a7b03ccf1"} Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.887037 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t98bs" event={"ID":"26e83196-9fbb-41ab-a359-d404437ee1e9","Type":"ContainerStarted","Data":"3aebcaf36af89ea2b878453e4a56aff9bdd4de047427e95dd0c843aa027042d3"} Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.887093 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t98bs" event={"ID":"26e83196-9fbb-41ab-a359-d404437ee1e9","Type":"ContainerStarted","Data":"c0ca4e295d5529de73a5e77fdf4e9c871d856d2e394304dd3b35ae6111ae534d"} Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.930037 4775 generic.go:334] "Generic (PLEG): container finished" podID="ae4804bb-2669-48fc-aa42-3e4f1c94323b" containerID="58dcd0089058b8564ce80e9ea21cfdd37b7245476ab12ddf99ca4d623da254d8" exitCode=0 Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.930375 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh2bb" event={"ID":"ae4804bb-2669-48fc-aa42-3e4f1c94323b","Type":"ContainerDied","Data":"58dcd0089058b8564ce80e9ea21cfdd37b7245476ab12ddf99ca4d623da254d8"} 
Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.930453 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh2bb" event={"ID":"ae4804bb-2669-48fc-aa42-3e4f1c94323b","Type":"ContainerStarted","Data":"8823a04aa97e5787bdc9218d608ccf5d912f95993f60aa6f811ee688743c00e7"} Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.931677 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hhm9d"] Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.932485 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.935646 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:19 crc kubenswrapper[4775]: E1216 14:57:19.935972 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.435959992 +0000 UTC m=+165.387038905 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.943611 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6kp7p" event={"ID":"56f482f0-f875-48f1-9a10-f06deeb5791e","Type":"ContainerStarted","Data":"da20c1d725951092ae8ac98b581bc44b6974ec765e5308f97c94fa9d1c528dd9"} Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.947612 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm57c" event={"ID":"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a","Type":"ContainerStarted","Data":"c7f534594cc07ac71b01dc11d5471c73e62b89304ef88e6771db91777992e1d0"} Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.949103 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l8g4" event={"ID":"ef64597f-59f1-47be-afc6-aa95fb3c355c","Type":"ContainerStarted","Data":"10874e66f0848717503d87122e93dfcf157714cebc5e058a474a5e5a2540625f"} Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.957028 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.958240 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.983744 4775 patch_prober.go:28] interesting pod/router-default-5444994796-dw4sb container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 14:57:19 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Dec 16 14:57:19 crc kubenswrapper[4775]: [+]process-running ok Dec 16 14:57:19 crc kubenswrapper[4775]: healthz check failed Dec 16 14:57:19 crc kubenswrapper[4775]: I1216 14:57:19.983810 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dw4sb" podUID="5d72e9f6-db87-4192-8f08-32788a4ad601" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.036839 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:20 crc kubenswrapper[4775]: E1216 14:57:20.037033 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.53700495 +0000 UTC m=+165.488083873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.038932 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:20 crc kubenswrapper[4775]: E1216 14:57:20.043084 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.54307336 +0000 UTC m=+165.494152283 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.140444 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:20 crc kubenswrapper[4775]: E1216 14:57:20.140982 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.640930168 +0000 UTC m=+165.592009091 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.141230 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:20 crc kubenswrapper[4775]: E1216 14:57:20.141470 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.641453944 +0000 UTC m=+165.592532867 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.197683 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxc57" Dec 16 14:57:20 crc kubenswrapper[4775]: W1216 14:57:20.209087 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce7e6431_8250_485f_a202_a781b4b719cb.slice/crio-ab2c4a6c19f4aeb32307e11321422c79b48aa0ead871bb8e849e2442fb4b3aa8 WatchSource:0}: Error finding container ab2c4a6c19f4aeb32307e11321422c79b48aa0ead871bb8e849e2442fb4b3aa8: Status 404 returned error can't find the container with id ab2c4a6c19f4aeb32307e11321422c79b48aa0ead871bb8e849e2442fb4b3aa8 Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.242982 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:20 crc kubenswrapper[4775]: E1216 14:57:20.243233 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.743194154 +0000 UTC m=+165.694273077 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.243490 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:20 crc kubenswrapper[4775]: E1216 14:57:20.244153 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.744143334 +0000 UTC m=+165.695222257 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.306242 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4vbnc"] Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.307373 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vbnc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.309426 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.318833 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vbnc"] Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.344388 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:20 crc kubenswrapper[4775]: E1216 14:57:20.344612 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.844569873 +0000 UTC m=+165.795648796 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.345016 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2be658-340d-4dc2-89b8-ee1fbde43d23-catalog-content\") pod \"redhat-marketplace-4vbnc\" (UID: \"6b2be658-340d-4dc2-89b8-ee1fbde43d23\") " pod="openshift-marketplace/redhat-marketplace-4vbnc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.345138 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2be658-340d-4dc2-89b8-ee1fbde43d23-utilities\") pod \"redhat-marketplace-4vbnc\" (UID: \"6b2be658-340d-4dc2-89b8-ee1fbde43d23\") " pod="openshift-marketplace/redhat-marketplace-4vbnc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.345259 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlqbr\" (UniqueName: \"kubernetes.io/projected/6b2be658-340d-4dc2-89b8-ee1fbde43d23-kube-api-access-rlqbr\") pod \"redhat-marketplace-4vbnc\" (UID: \"6b2be658-340d-4dc2-89b8-ee1fbde43d23\") " pod="openshift-marketplace/redhat-marketplace-4vbnc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.345377 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:20 crc kubenswrapper[4775]: E1216 14:57:20.345805 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.84578217 +0000 UTC m=+165.796861093 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.446930 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:20 crc kubenswrapper[4775]: E1216 14:57:20.447134 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.947103737 +0000 UTC m=+165.898182660 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.447560 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlqbr\" (UniqueName: \"kubernetes.io/projected/6b2be658-340d-4dc2-89b8-ee1fbde43d23-kube-api-access-rlqbr\") pod \"redhat-marketplace-4vbnc\" (UID: \"6b2be658-340d-4dc2-89b8-ee1fbde43d23\") " pod="openshift-marketplace/redhat-marketplace-4vbnc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.447673 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.447776 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2be658-340d-4dc2-89b8-ee1fbde43d23-catalog-content\") pod \"redhat-marketplace-4vbnc\" (UID: \"6b2be658-340d-4dc2-89b8-ee1fbde43d23\") " pod="openshift-marketplace/redhat-marketplace-4vbnc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.447928 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2be658-340d-4dc2-89b8-ee1fbde43d23-utilities\") pod \"redhat-marketplace-4vbnc\" (UID: 
\"6b2be658-340d-4dc2-89b8-ee1fbde43d23\") " pod="openshift-marketplace/redhat-marketplace-4vbnc" Dec 16 14:57:20 crc kubenswrapper[4775]: E1216 14:57:20.448190 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.948182081 +0000 UTC m=+165.899261004 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.448511 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2be658-340d-4dc2-89b8-ee1fbde43d23-utilities\") pod \"redhat-marketplace-4vbnc\" (UID: \"6b2be658-340d-4dc2-89b8-ee1fbde43d23\") " pod="openshift-marketplace/redhat-marketplace-4vbnc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.448664 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2be658-340d-4dc2-89b8-ee1fbde43d23-catalog-content\") pod \"redhat-marketplace-4vbnc\" (UID: \"6b2be658-340d-4dc2-89b8-ee1fbde43d23\") " pod="openshift-marketplace/redhat-marketplace-4vbnc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.484293 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlqbr\" (UniqueName: \"kubernetes.io/projected/6b2be658-340d-4dc2-89b8-ee1fbde43d23-kube-api-access-rlqbr\") pod \"redhat-marketplace-4vbnc\" (UID: 
\"6b2be658-340d-4dc2-89b8-ee1fbde43d23\") " pod="openshift-marketplace/redhat-marketplace-4vbnc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.549150 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:20 crc kubenswrapper[4775]: E1216 14:57:20.549289 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:21.04926553 +0000 UTC m=+166.000344463 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.549520 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs\") pod \"network-metrics-daemon-c6mdt\" (UID: \"3d592ae8-792f-4cc5-9a32-b278deb33810\") " pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.549577 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:20 crc kubenswrapper[4775]: E1216 14:57:20.549958 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:21.049946671 +0000 UTC m=+166.001025594 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.553493 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d592ae8-792f-4cc5-9a32-b278deb33810-metrics-certs\") pod \"network-metrics-daemon-c6mdt\" (UID: \"3d592ae8-792f-4cc5-9a32-b278deb33810\") " pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.623208 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vbnc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.636631 4775 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.647571 4775 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-16T14:57:20.636656499Z","Handler":null,"Name":""} Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.650949 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:20 crc kubenswrapper[4775]: E1216 14:57:20.651171 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:21.151135624 +0000 UTC m=+166.102214547 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.651253 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:20 crc kubenswrapper[4775]: E1216 14:57:20.651603 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:57:21.151590308 +0000 UTC m=+166.102669221 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sxl49" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.653366 4775 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.653404 4775 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.706283 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rhnlc"] Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.707545 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhnlc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.723651 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhnlc"] Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.753413 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.753704 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5046921-0655-4cef-b310-018ed7ea22c4-utilities\") pod \"redhat-marketplace-rhnlc\" (UID: \"d5046921-0655-4cef-b310-018ed7ea22c4\") " pod="openshift-marketplace/redhat-marketplace-rhnlc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.753754 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q287q\" (UniqueName: \"kubernetes.io/projected/d5046921-0655-4cef-b310-018ed7ea22c4-kube-api-access-q287q\") pod \"redhat-marketplace-rhnlc\" (UID: \"d5046921-0655-4cef-b310-018ed7ea22c4\") " pod="openshift-marketplace/redhat-marketplace-rhnlc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.753979 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5046921-0655-4cef-b310-018ed7ea22c4-catalog-content\") pod \"redhat-marketplace-rhnlc\" (UID: \"d5046921-0655-4cef-b310-018ed7ea22c4\") " pod="openshift-marketplace/redhat-marketplace-rhnlc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.762624 4775 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.781552 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6mdt" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.856393 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.856498 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5046921-0655-4cef-b310-018ed7ea22c4-catalog-content\") pod \"redhat-marketplace-rhnlc\" (UID: \"d5046921-0655-4cef-b310-018ed7ea22c4\") " pod="openshift-marketplace/redhat-marketplace-rhnlc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.856526 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5046921-0655-4cef-b310-018ed7ea22c4-utilities\") pod \"redhat-marketplace-rhnlc\" (UID: \"d5046921-0655-4cef-b310-018ed7ea22c4\") " pod="openshift-marketplace/redhat-marketplace-rhnlc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.856544 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q287q\" 
(UniqueName: \"kubernetes.io/projected/d5046921-0655-4cef-b310-018ed7ea22c4-kube-api-access-q287q\") pod \"redhat-marketplace-rhnlc\" (UID: \"d5046921-0655-4cef-b310-018ed7ea22c4\") " pod="openshift-marketplace/redhat-marketplace-rhnlc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.858293 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5046921-0655-4cef-b310-018ed7ea22c4-catalog-content\") pod \"redhat-marketplace-rhnlc\" (UID: \"d5046921-0655-4cef-b310-018ed7ea22c4\") " pod="openshift-marketplace/redhat-marketplace-rhnlc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.858688 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5046921-0655-4cef-b310-018ed7ea22c4-utilities\") pod \"redhat-marketplace-rhnlc\" (UID: \"d5046921-0655-4cef-b310-018ed7ea22c4\") " pod="openshift-marketplace/redhat-marketplace-rhnlc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.863982 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vbnc"] Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.864380 4775 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.864430 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.888227 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q287q\" (UniqueName: \"kubernetes.io/projected/d5046921-0655-4cef-b310-018ed7ea22c4-kube-api-access-q287q\") pod \"redhat-marketplace-rhnlc\" (UID: \"d5046921-0655-4cef-b310-018ed7ea22c4\") " pod="openshift-marketplace/redhat-marketplace-rhnlc" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.896739 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sxl49\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:20 crc kubenswrapper[4775]: W1216 14:57:20.902877 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b2be658_340d_4dc2_89b8_ee1fbde43d23.slice/crio-55d930e33c096b55b10803525772dfdaa0588104d4719de2215ac7312da61919 WatchSource:0}: Error finding container 55d930e33c096b55b10803525772dfdaa0588104d4719de2215ac7312da61919: Status 404 returned error can't find the container with id 
55d930e33c096b55b10803525772dfdaa0588104d4719de2215ac7312da61919 Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.955687 4775 generic.go:334] "Generic (PLEG): container finished" podID="ce7e6431-8250-485f-a202-a781b4b719cb" containerID="856fbd9b77671619e2fe79f71ff0f8d3883f9edf6d1a0bd11b1af97c5a7a900b" exitCode=0 Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.955788 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhm9d" event={"ID":"ce7e6431-8250-485f-a202-a781b4b719cb","Type":"ContainerDied","Data":"856fbd9b77671619e2fe79f71ff0f8d3883f9edf6d1a0bd11b1af97c5a7a900b"} Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.956003 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhm9d" event={"ID":"ce7e6431-8250-485f-a202-a781b4b719cb","Type":"ContainerStarted","Data":"ab2c4a6c19f4aeb32307e11321422c79b48aa0ead871bb8e849e2442fb4b3aa8"} Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.964969 4775 generic.go:334] "Generic (PLEG): container finished" podID="a1734b68-7b3d-49f0-9398-879da24fa19d" containerID="cd47c42773fde89848aef40302bc63d5eda4f961265c90cc888a03e584302595" exitCode=0 Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.965077 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455" event={"ID":"a1734b68-7b3d-49f0-9398-879da24fa19d","Type":"ContainerDied","Data":"cd47c42773fde89848aef40302bc63d5eda4f961265c90cc888a03e584302595"} Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.966767 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vbnc" event={"ID":"6b2be658-340d-4dc2-89b8-ee1fbde43d23","Type":"ContainerStarted","Data":"55d930e33c096b55b10803525772dfdaa0588104d4719de2215ac7312da61919"} Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.974798 4775 patch_prober.go:28] interesting 
pod/router-default-5444994796-dw4sb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 14:57:20 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Dec 16 14:57:20 crc kubenswrapper[4775]: [+]process-running ok Dec 16 14:57:20 crc kubenswrapper[4775]: healthz check failed Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.974840 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dw4sb" podUID="5d72e9f6-db87-4192-8f08-32788a4ad601" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.975493 4775 generic.go:334] "Generic (PLEG): container finished" podID="7c243fb1-03dc-4ed6-9fa6-418e871d8b5a" containerID="b92742153839286dee554116f17e643181486890f70533950be08edf99d3afe4" exitCode=0 Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.975586 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm57c" event={"ID":"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a","Type":"ContainerDied","Data":"b92742153839286dee554116f17e643181486890f70533950be08edf99d3afe4"} Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.979732 4775 generic.go:334] "Generic (PLEG): container finished" podID="ef64597f-59f1-47be-afc6-aa95fb3c355c" containerID="44aee6d9217e503300e08cebfa19294d0efbda3bfcfad3da2fa3d83defde84ea" exitCode=0 Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.979787 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l8g4" event={"ID":"ef64597f-59f1-47be-afc6-aa95fb3c355c","Type":"ContainerDied","Data":"44aee6d9217e503300e08cebfa19294d0efbda3bfcfad3da2fa3d83defde84ea"} Dec 16 14:57:20 crc kubenswrapper[4775]: I1216 14:57:20.985492 4775 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="hostpath-provisioner/csi-hostpathplugin-t98bs" event={"ID":"26e83196-9fbb-41ab-a359-d404437ee1e9","Type":"ContainerStarted","Data":"c86311632165709f3a54f78c32e94231336242d387ce8b14f18195a215f5fde0"} Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.070014 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.085414 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c6mdt"] Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.119048 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhnlc" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.331250 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wrkz2"] Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.333119 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wrkz2" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.334328 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sxl49"] Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.335803 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 16 14:57:21 crc kubenswrapper[4775]: W1216 14:57:21.364341 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55dc0f62_62c4_48e2_9eb9_4998ad616e7f.slice/crio-2ea92d2458be6f4dbd64fd9c099f685de6ce1472f38b3b3f40bc8663695432d3 WatchSource:0}: Error finding container 2ea92d2458be6f4dbd64fd9c099f685de6ce1472f38b3b3f40bc8663695432d3: Status 404 returned error can't find the container with id 2ea92d2458be6f4dbd64fd9c099f685de6ce1472f38b3b3f40bc8663695432d3 Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.374547 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-catalog-content\") pod \"redhat-operators-wrkz2\" (UID: \"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2\") " pod="openshift-marketplace/redhat-operators-wrkz2" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.374956 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-utilities\") pod \"redhat-operators-wrkz2\" (UID: \"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2\") " pod="openshift-marketplace/redhat-operators-wrkz2" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.375057 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48r5p\" (UniqueName: 
\"kubernetes.io/projected/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-kube-api-access-48r5p\") pod \"redhat-operators-wrkz2\" (UID: \"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2\") " pod="openshift-marketplace/redhat-operators-wrkz2" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.412327 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.412965 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wrkz2"] Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.476710 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48r5p\" (UniqueName: \"kubernetes.io/projected/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-kube-api-access-48r5p\") pod \"redhat-operators-wrkz2\" (UID: \"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2\") " pod="openshift-marketplace/redhat-operators-wrkz2" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.476771 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-catalog-content\") pod \"redhat-operators-wrkz2\" (UID: \"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2\") " pod="openshift-marketplace/redhat-operators-wrkz2" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.476863 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-utilities\") pod \"redhat-operators-wrkz2\" (UID: \"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2\") " pod="openshift-marketplace/redhat-operators-wrkz2" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.477335 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-utilities\") pod \"redhat-operators-wrkz2\" (UID: \"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2\") " pod="openshift-marketplace/redhat-operators-wrkz2" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.477563 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-catalog-content\") pod \"redhat-operators-wrkz2\" (UID: \"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2\") " pod="openshift-marketplace/redhat-operators-wrkz2" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.501727 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48r5p\" (UniqueName: \"kubernetes.io/projected/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-kube-api-access-48r5p\") pod \"redhat-operators-wrkz2\" (UID: \"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2\") " pod="openshift-marketplace/redhat-operators-wrkz2" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.523259 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhnlc"] Dec 16 14:57:21 crc kubenswrapper[4775]: W1216 14:57:21.531988 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5046921_0655_4cef_b310_018ed7ea22c4.slice/crio-8d03d66851101c7e1d4291882e5498e4c3bf60ac28b176277e427b376df4a0ff WatchSource:0}: Error finding container 8d03d66851101c7e1d4291882e5498e4c3bf60ac28b176277e427b376df4a0ff: Status 404 returned error can't find the container with id 8d03d66851101c7e1d4291882e5498e4c3bf60ac28b176277e427b376df4a0ff Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.680280 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wrkz2" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.705653 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xmqlw"] Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.706983 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xmqlw" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.761384 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xmqlw"] Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.782425 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjfff\" (UniqueName: \"kubernetes.io/projected/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-kube-api-access-bjfff\") pod \"redhat-operators-xmqlw\" (UID: \"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c\") " pod="openshift-marketplace/redhat-operators-xmqlw" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.782476 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-catalog-content\") pod \"redhat-operators-xmqlw\" (UID: \"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c\") " pod="openshift-marketplace/redhat-operators-xmqlw" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.782524 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-utilities\") pod \"redhat-operators-xmqlw\" (UID: \"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c\") " pod="openshift-marketplace/redhat-operators-xmqlw" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.883560 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bjfff\" (UniqueName: \"kubernetes.io/projected/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-kube-api-access-bjfff\") pod \"redhat-operators-xmqlw\" (UID: \"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c\") " pod="openshift-marketplace/redhat-operators-xmqlw" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.883625 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-catalog-content\") pod \"redhat-operators-xmqlw\" (UID: \"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c\") " pod="openshift-marketplace/redhat-operators-xmqlw" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.883709 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-utilities\") pod \"redhat-operators-xmqlw\" (UID: \"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c\") " pod="openshift-marketplace/redhat-operators-xmqlw" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.884442 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-utilities\") pod \"redhat-operators-xmqlw\" (UID: \"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c\") " pod="openshift-marketplace/redhat-operators-xmqlw" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.884764 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-catalog-content\") pod \"redhat-operators-xmqlw\" (UID: \"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c\") " pod="openshift-marketplace/redhat-operators-xmqlw" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.913682 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjfff\" (UniqueName: 
\"kubernetes.io/projected/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-kube-api-access-bjfff\") pod \"redhat-operators-xmqlw\" (UID: \"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c\") " pod="openshift-marketplace/redhat-operators-xmqlw" Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.973973 4775 patch_prober.go:28] interesting pod/router-default-5444994796-dw4sb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 14:57:21 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Dec 16 14:57:21 crc kubenswrapper[4775]: [+]process-running ok Dec 16 14:57:21 crc kubenswrapper[4775]: healthz check failed Dec 16 14:57:21 crc kubenswrapper[4775]: I1216 14:57:21.974044 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dw4sb" podUID="5d72e9f6-db87-4192-8f08-32788a4ad601" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.000178 4775 generic.go:334] "Generic (PLEG): container finished" podID="d5046921-0655-4cef-b310-018ed7ea22c4" containerID="6bfa6100877012029ae4814c41e16486361c525d2e6093f5e13570459a918943" exitCode=0 Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.000242 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhnlc" event={"ID":"d5046921-0655-4cef-b310-018ed7ea22c4","Type":"ContainerDied","Data":"6bfa6100877012029ae4814c41e16486361c525d2e6093f5e13570459a918943"} Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.000270 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhnlc" event={"ID":"d5046921-0655-4cef-b310-018ed7ea22c4","Type":"ContainerStarted","Data":"8d03d66851101c7e1d4291882e5498e4c3bf60ac28b176277e427b376df4a0ff"} Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 
14:57:22.011622 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" event={"ID":"55dc0f62-62c4-48e2-9eb9-4998ad616e7f","Type":"ContainerStarted","Data":"c9832c81b687b5b6bba176a28f175b833d7aed51a628d7d2594e3aa9550a03e0"} Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.011662 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" event={"ID":"55dc0f62-62c4-48e2-9eb9-4998ad616e7f","Type":"ContainerStarted","Data":"2ea92d2458be6f4dbd64fd9c099f685de6ce1472f38b3b3f40bc8663695432d3"} Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.011738 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.014378 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" event={"ID":"3d592ae8-792f-4cc5-9a32-b278deb33810","Type":"ContainerStarted","Data":"c85dec58df574e3cb6a2ad0f5b59db33f947d182ea7f87146873c7c3fecbcc1d"} Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.014411 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" event={"ID":"3d592ae8-792f-4cc5-9a32-b278deb33810","Type":"ContainerStarted","Data":"186ee503ad402927bd374a4c57a399b35adfeea1fde7448f6671f8d5ec4d93da"} Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.021679 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t98bs" event={"ID":"26e83196-9fbb-41ab-a359-d404437ee1e9","Type":"ContainerStarted","Data":"f17b031312ba1a227d254ad27caaddacf861a6f690eacb145d486d0abc1b7535"} Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.034049 4775 generic.go:334] "Generic (PLEG): container finished" podID="6b2be658-340d-4dc2-89b8-ee1fbde43d23" 
containerID="ea057c9d189586afd6c603cc5fa8b9aabf053b1f8e2c4a03fd2e693052c0a487" exitCode=0 Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.035192 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vbnc" event={"ID":"6b2be658-340d-4dc2-89b8-ee1fbde43d23","Type":"ContainerDied","Data":"ea057c9d189586afd6c603cc5fa8b9aabf053b1f8e2c4a03fd2e693052c0a487"} Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.044012 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" podStartSLOduration=145.043988202 podStartE2EDuration="2m25.043988202s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:22.041727071 +0000 UTC m=+166.992806004" watchObservedRunningTime="2025-12-16 14:57:22.043988202 +0000 UTC m=+166.995067125" Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.093281 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-t98bs" podStartSLOduration=12.093259446 podStartE2EDuration="12.093259446s" podCreationTimestamp="2025-12-16 14:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:22.075832929 +0000 UTC m=+167.026911862" watchObservedRunningTime="2025-12-16 14:57:22.093259446 +0000 UTC m=+167.044338369" Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.154434 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xmqlw"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.160165 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wrkz2"]
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.364557 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.370822 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.378716 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.379392 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.409088 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.426799 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-nxkgw"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.436992 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-nxkgw"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.517254 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bb9581e-a66d-4235-a1c7-21aa3e149137-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7bb9581e-a66d-4235-a1c7-21aa3e149137\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.517423 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bb9581e-a66d-4235-a1c7-21aa3e149137-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7bb9581e-a66d-4235-a1c7-21aa3e149137\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.557763 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.618781 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1734b68-7b3d-49f0-9398-879da24fa19d-config-volume\") pod \"a1734b68-7b3d-49f0-9398-879da24fa19d\" (UID: \"a1734b68-7b3d-49f0-9398-879da24fa19d\") "
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.618920 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q68lc\" (UniqueName: \"kubernetes.io/projected/a1734b68-7b3d-49f0-9398-879da24fa19d-kube-api-access-q68lc\") pod \"a1734b68-7b3d-49f0-9398-879da24fa19d\" (UID: \"a1734b68-7b3d-49f0-9398-879da24fa19d\") "
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.619003 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1734b68-7b3d-49f0-9398-879da24fa19d-secret-volume\") pod \"a1734b68-7b3d-49f0-9398-879da24fa19d\" (UID: \"a1734b68-7b3d-49f0-9398-879da24fa19d\") "
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.619170 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bb9581e-a66d-4235-a1c7-21aa3e149137-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7bb9581e-a66d-4235-a1c7-21aa3e149137\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.619248 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bb9581e-a66d-4235-a1c7-21aa3e149137-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7bb9581e-a66d-4235-a1c7-21aa3e149137\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.619337 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bb9581e-a66d-4235-a1c7-21aa3e149137-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7bb9581e-a66d-4235-a1c7-21aa3e149137\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.619842 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1734b68-7b3d-49f0-9398-879da24fa19d-config-volume" (OuterVolumeSpecName: "config-volume") pod "a1734b68-7b3d-49f0-9398-879da24fa19d" (UID: "a1734b68-7b3d-49f0-9398-879da24fa19d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.632141 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1734b68-7b3d-49f0-9398-879da24fa19d-kube-api-access-q68lc" (OuterVolumeSpecName: "kube-api-access-q68lc") pod "a1734b68-7b3d-49f0-9398-879da24fa19d" (UID: "a1734b68-7b3d-49f0-9398-879da24fa19d"). InnerVolumeSpecName "kube-api-access-q68lc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.648131 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1734b68-7b3d-49f0-9398-879da24fa19d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a1734b68-7b3d-49f0-9398-879da24fa19d" (UID: "a1734b68-7b3d-49f0-9398-879da24fa19d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.654473 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bb9581e-a66d-4235-a1c7-21aa3e149137-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7bb9581e-a66d-4235-a1c7-21aa3e149137\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.721621 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1734b68-7b3d-49f0-9398-879da24fa19d-config-volume\") on node \"crc\" DevicePath \"\""
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.721654 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q68lc\" (UniqueName: \"kubernetes.io/projected/a1734b68-7b3d-49f0-9398-879da24fa19d-kube-api-access-q68lc\") on node \"crc\" DevicePath \"\""
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.721664 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1734b68-7b3d-49f0-9398-879da24fa19d-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.721756 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.736705 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.737066 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.774356 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.793489 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.924614 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xmqlw"]
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.960114 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-5bmtw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body=
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.960441 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5bmtw" podUID="edb75c2a-9e6f-4a80-aadd-38416ba9c9a4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.960192 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-5bmtw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body=
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.960648 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5bmtw" podUID="edb75c2a-9e6f-4a80-aadd-38416ba9c9a4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused"
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.986755 4775 patch_prober.go:28] interesting pod/router-default-5444994796-dw4sb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 16 14:57:22 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld
Dec 16 14:57:22 crc kubenswrapper[4775]: [+]process-running ok
Dec 16 14:57:22 crc kubenswrapper[4775]: healthz check failed
Dec 16 14:57:22 crc kubenswrapper[4775]: I1216 14:57:22.986828 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dw4sb" podUID="5d72e9f6-db87-4192-8f08-32788a4ad601" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 16 14:57:23 crc kubenswrapper[4775]: I1216 14:57:23.092052 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmqlw" event={"ID":"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c","Type":"ContainerStarted","Data":"d62251ce7352026a2de91e2ca25f72131d43f698370d18e84c66aedc49dfc5f1"}
Dec 16 14:57:23 crc kubenswrapper[4775]: I1216 14:57:23.101267 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455" event={"ID":"a1734b68-7b3d-49f0-9398-879da24fa19d","Type":"ContainerDied","Data":"f338d89ac587d68772c28a4548095e8a48de9d14981f753c2c0766e2554498af"}
Dec 16 14:57:23 crc kubenswrapper[4775]: I1216 14:57:23.101308 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f338d89ac587d68772c28a4548095e8a48de9d14981f753c2c0766e2554498af"
Dec 16 14:57:23 crc kubenswrapper[4775]: I1216 14:57:23.101375 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455"
Dec 16 14:57:23 crc kubenswrapper[4775]: I1216 14:57:23.140795 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrkz2" event={"ID":"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2","Type":"ContainerStarted","Data":"93821fab426e8805af45bae51a3f590c5dcbf9706873896e55731e596df31a3e"}
Dec 16 14:57:23 crc kubenswrapper[4775]: I1216 14:57:23.140856 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrkz2" event={"ID":"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2","Type":"ContainerStarted","Data":"f0450d85832982a65eae76e07b9516ec9bbbd353f005d8ee8952cf05a5e90fe0"}
Dec 16 14:57:23 crc kubenswrapper[4775]: I1216 14:57:23.164318 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c6mdt" event={"ID":"3d592ae8-792f-4cc5-9a32-b278deb33810","Type":"ContainerStarted","Data":"3c6082b93dc62ecd5ad7d9dd69f689a0a518cad9613650f99d217ede84629e64"}
Dec 16 14:57:23 crc kubenswrapper[4775]: I1216 14:57:23.192330 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hr98m"
Dec 16 14:57:23 crc kubenswrapper[4775]: I1216 14:57:23.226377 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-c6mdt" podStartSLOduration=146.22636036 podStartE2EDuration="2m26.22636036s" podCreationTimestamp="2025-12-16 14:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:23.225288167 +0000 UTC m=+168.176367100" watchObservedRunningTime="2025-12-16 14:57:23.22636036 +0000 UTC m=+168.177439283"
Dec 16 14:57:23 crc kubenswrapper[4775]: I1216 14:57:23.241373 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-fc2jr"
Dec 16 14:57:23 crc kubenswrapper[4775]: I1216 14:57:23.241422 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-fc2jr"
Dec 16 14:57:23 crc kubenswrapper[4775]: I1216 14:57:23.250048 4775 patch_prober.go:28] interesting pod/console-f9d7485db-fc2jr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body=
Dec 16 14:57:23 crc kubenswrapper[4775]: I1216 14:57:23.250116 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-fc2jr" podUID="c21af7b0-6f27-43de-8c44-6e6519262019" containerName="console" probeResult="failure" output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused"
Dec 16 14:57:23 crc kubenswrapper[4775]: I1216 14:57:23.546580 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 16 14:57:23 crc kubenswrapper[4775]: I1216 14:57:23.970896 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-dw4sb"
Dec 16 14:57:23 crc kubenswrapper[4775]: I1216 14:57:23.977077 4775 patch_prober.go:28] interesting pod/router-default-5444994796-dw4sb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 16 14:57:23 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld
Dec 16 14:57:23 crc kubenswrapper[4775]: [+]process-running ok
Dec 16 14:57:23 crc kubenswrapper[4775]: healthz check failed
Dec 16 14:57:23 crc kubenswrapper[4775]: I1216 14:57:23.977163 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dw4sb" podUID="5d72e9f6-db87-4192-8f08-32788a4ad601" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.252751 4775 generic.go:334] "Generic (PLEG): container finished" podID="f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" containerID="93821fab426e8805af45bae51a3f590c5dcbf9706873896e55731e596df31a3e" exitCode=0
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.252879 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrkz2" event={"ID":"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2","Type":"ContainerDied","Data":"93821fab426e8805af45bae51a3f590c5dcbf9706873896e55731e596df31a3e"}
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.270338 4775 generic.go:334] "Generic (PLEG): container finished" podID="a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c" containerID="32c4b76138b7d1251a29272c2f8ab0ce4cba04bd48729d29e94a8338acf4f447" exitCode=0
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.270453 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmqlw" event={"ID":"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c","Type":"ContainerDied","Data":"32c4b76138b7d1251a29272c2f8ab0ce4cba04bd48729d29e94a8338acf4f447"}
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.297172 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7bb9581e-a66d-4235-a1c7-21aa3e149137","Type":"ContainerStarted","Data":"165bb2f994d295be04751cc64b1e6a5ac176a0535ded15bfb7851e0a19d22786"}
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.677816 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 16 14:57:24 crc kubenswrapper[4775]: E1216 14:57:24.678343 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1734b68-7b3d-49f0-9398-879da24fa19d" containerName="collect-profiles"
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.678358 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1734b68-7b3d-49f0-9398-879da24fa19d" containerName="collect-profiles"
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.678508 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1734b68-7b3d-49f0-9398-879da24fa19d" containerName="collect-profiles"
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.680672 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.687048 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.687523 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.687523 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.710761 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c63468ee-465d-48df-b53e-ca5866d5a7ca-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c63468ee-465d-48df-b53e-ca5866d5a7ca\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.710819 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c63468ee-465d-48df-b53e-ca5866d5a7ca-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c63468ee-465d-48df-b53e-ca5866d5a7ca\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.812268 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c63468ee-465d-48df-b53e-ca5866d5a7ca-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c63468ee-465d-48df-b53e-ca5866d5a7ca\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.812358 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c63468ee-465d-48df-b53e-ca5866d5a7ca-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c63468ee-465d-48df-b53e-ca5866d5a7ca\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.812820 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c63468ee-465d-48df-b53e-ca5866d5a7ca-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c63468ee-465d-48df-b53e-ca5866d5a7ca\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.843038 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c63468ee-465d-48df-b53e-ca5866d5a7ca-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c63468ee-465d-48df-b53e-ca5866d5a7ca\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.976440 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-dw4sb"
Dec 16 14:57:24 crc kubenswrapper[4775]: I1216 14:57:24.991516 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-dw4sb"
Dec 16 14:57:25 crc kubenswrapper[4775]: I1216 14:57:25.011589 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 16 14:57:25 crc kubenswrapper[4775]: I1216 14:57:25.368557 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7bb9581e-a66d-4235-a1c7-21aa3e149137","Type":"ContainerStarted","Data":"21cf1db5ccfac585575522a2123384d8136584743a726ec69cef70e8fcccc5ff"}
Dec 16 14:57:25 crc kubenswrapper[4775]: I1216 14:57:25.522525 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 16 14:57:26 crc kubenswrapper[4775]: I1216 14:57:26.375603 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c63468ee-465d-48df-b53e-ca5866d5a7ca","Type":"ContainerStarted","Data":"a5835887581a4df41704b7985c618897f6d9e84411f383af7c51c97c24aad63b"}
Dec 16 14:57:26 crc kubenswrapper[4775]: I1216 14:57:26.390931 4775 generic.go:334] "Generic (PLEG): container finished" podID="7bb9581e-a66d-4235-a1c7-21aa3e149137" containerID="21cf1db5ccfac585575522a2123384d8136584743a726ec69cef70e8fcccc5ff" exitCode=0
Dec 16 14:57:26 crc kubenswrapper[4775]: I1216 14:57:26.390991 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7bb9581e-a66d-4235-a1c7-21aa3e149137","Type":"ContainerDied","Data":"21cf1db5ccfac585575522a2123384d8136584743a726ec69cef70e8fcccc5ff"}
Dec 16 14:57:27 crc kubenswrapper[4775]: I1216 14:57:27.083120 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 16 14:57:27 crc kubenswrapper[4775]: I1216 14:57:27.194796 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bb9581e-a66d-4235-a1c7-21aa3e149137-kube-api-access\") pod \"7bb9581e-a66d-4235-a1c7-21aa3e149137\" (UID: \"7bb9581e-a66d-4235-a1c7-21aa3e149137\") "
Dec 16 14:57:27 crc kubenswrapper[4775]: I1216 14:57:27.194982 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bb9581e-a66d-4235-a1c7-21aa3e149137-kubelet-dir\") pod \"7bb9581e-a66d-4235-a1c7-21aa3e149137\" (UID: \"7bb9581e-a66d-4235-a1c7-21aa3e149137\") "
Dec 16 14:57:27 crc kubenswrapper[4775]: I1216 14:57:27.195309 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bb9581e-a66d-4235-a1c7-21aa3e149137-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7bb9581e-a66d-4235-a1c7-21aa3e149137" (UID: "7bb9581e-a66d-4235-a1c7-21aa3e149137"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 14:57:27 crc kubenswrapper[4775]: I1216 14:57:27.229242 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb9581e-a66d-4235-a1c7-21aa3e149137-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7bb9581e-a66d-4235-a1c7-21aa3e149137" (UID: "7bb9581e-a66d-4235-a1c7-21aa3e149137"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 14:57:27 crc kubenswrapper[4775]: I1216 14:57:27.303574 4775 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bb9581e-a66d-4235-a1c7-21aa3e149137-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 16 14:57:27 crc kubenswrapper[4775]: I1216 14:57:27.303615 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bb9581e-a66d-4235-a1c7-21aa3e149137-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 16 14:57:27 crc kubenswrapper[4775]: I1216 14:57:27.416247 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c63468ee-465d-48df-b53e-ca5866d5a7ca","Type":"ContainerStarted","Data":"7dec8d42fe5067ddb4f1a22b3ba9197ff741057ed9d4453a113f48395479dd6d"}
Dec 16 14:57:27 crc kubenswrapper[4775]: I1216 14:57:27.419310 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7bb9581e-a66d-4235-a1c7-21aa3e149137","Type":"ContainerDied","Data":"165bb2f994d295be04751cc64b1e6a5ac176a0535ded15bfb7851e0a19d22786"}
Dec 16 14:57:27 crc kubenswrapper[4775]: I1216 14:57:27.419335 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="165bb2f994d295be04751cc64b1e6a5ac176a0535ded15bfb7851e0a19d22786"
Dec 16 14:57:27 crc kubenswrapper[4775]: I1216 14:57:27.420003 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 16 14:57:27 crc kubenswrapper[4775]: I1216 14:57:27.440089 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.440071815 podStartE2EDuration="3.440071815s" podCreationTimestamp="2025-12-16 14:57:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:57:27.436998479 +0000 UTC m=+172.388077422" watchObservedRunningTime="2025-12-16 14:57:27.440071815 +0000 UTC m=+172.391150738"
Dec 16 14:57:28 crc kubenswrapper[4775]: I1216 14:57:28.813562 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6kp7p"
Dec 16 14:57:29 crc kubenswrapper[4775]: I1216 14:57:29.443538 4775 generic.go:334] "Generic (PLEG): container finished" podID="c63468ee-465d-48df-b53e-ca5866d5a7ca" containerID="7dec8d42fe5067ddb4f1a22b3ba9197ff741057ed9d4453a113f48395479dd6d" exitCode=0
Dec 16 14:57:29 crc kubenswrapper[4775]: I1216 14:57:29.443813 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c63468ee-465d-48df-b53e-ca5866d5a7ca","Type":"ContainerDied","Data":"7dec8d42fe5067ddb4f1a22b3ba9197ff741057ed9d4453a113f48395479dd6d"}
Dec 16 14:57:30 crc kubenswrapper[4775]: I1216 14:57:30.924647 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 16 14:57:31 crc kubenswrapper[4775]: I1216 14:57:31.106014 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c63468ee-465d-48df-b53e-ca5866d5a7ca-kube-api-access\") pod \"c63468ee-465d-48df-b53e-ca5866d5a7ca\" (UID: \"c63468ee-465d-48df-b53e-ca5866d5a7ca\") "
Dec 16 14:57:31 crc kubenswrapper[4775]: I1216 14:57:31.106074 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c63468ee-465d-48df-b53e-ca5866d5a7ca-kubelet-dir\") pod \"c63468ee-465d-48df-b53e-ca5866d5a7ca\" (UID: \"c63468ee-465d-48df-b53e-ca5866d5a7ca\") "
Dec 16 14:57:31 crc kubenswrapper[4775]: I1216 14:57:31.106212 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c63468ee-465d-48df-b53e-ca5866d5a7ca-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c63468ee-465d-48df-b53e-ca5866d5a7ca" (UID: "c63468ee-465d-48df-b53e-ca5866d5a7ca"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 14:57:31 crc kubenswrapper[4775]: I1216 14:57:31.106926 4775 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c63468ee-465d-48df-b53e-ca5866d5a7ca-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 16 14:57:31 crc kubenswrapper[4775]: I1216 14:57:31.119821 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c63468ee-465d-48df-b53e-ca5866d5a7ca-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c63468ee-465d-48df-b53e-ca5866d5a7ca" (UID: "c63468ee-465d-48df-b53e-ca5866d5a7ca"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 14:57:31 crc kubenswrapper[4775]: I1216 14:57:31.207439 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c63468ee-465d-48df-b53e-ca5866d5a7ca-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 16 14:57:31 crc kubenswrapper[4775]: I1216 14:57:31.468448 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c63468ee-465d-48df-b53e-ca5866d5a7ca","Type":"ContainerDied","Data":"a5835887581a4df41704b7985c618897f6d9e84411f383af7c51c97c24aad63b"}
Dec 16 14:57:31 crc kubenswrapper[4775]: I1216 14:57:31.468493 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5835887581a4df41704b7985c618897f6d9e84411f383af7c51c97c24aad63b"
Dec 16 14:57:31 crc kubenswrapper[4775]: I1216 14:57:31.468621 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 16 14:57:32 crc kubenswrapper[4775]: I1216 14:57:32.869935 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 14:57:32 crc kubenswrapper[4775]: I1216 14:57:32.870248 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 14:57:32 crc kubenswrapper[4775]: I1216 14:57:32.958518 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-5bmtw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body=
Dec 16 14:57:32 crc kubenswrapper[4775]: I1216 14:57:32.958580 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5bmtw" podUID="edb75c2a-9e6f-4a80-aadd-38416ba9c9a4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused"
Dec 16 14:57:32 crc kubenswrapper[4775]: I1216 14:57:32.958609 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-5bmtw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body=
Dec 16 14:57:32 crc kubenswrapper[4775]: I1216 14:57:32.958669 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5bmtw" podUID="edb75c2a-9e6f-4a80-aadd-38416ba9c9a4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused"
Dec 16 14:57:33 crc kubenswrapper[4775]: I1216 14:57:33.299533 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-fc2jr"
Dec 16 14:57:33 crc kubenswrapper[4775]: I1216 14:57:33.304642 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-fc2jr"
Dec 16 14:57:35 crc kubenswrapper[4775]: I1216 14:57:35.970332 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-twhnr"]
Dec 16 14:57:35 crc kubenswrapper[4775]: I1216 14:57:35.970612 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" podUID="68f674b8-b7c3-43e8-b132-7d6b881cbd31" containerName="controller-manager" containerID="cri-o://5d55f4b497106e7e0dc04a363d9e5a58d6a1d1626b5e6866bc391e04e0d5b7c8" gracePeriod=30
Dec 16 14:57:35 crc kubenswrapper[4775]: I1216 14:57:35.973862 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd"]
Dec 16 14:57:35 crc kubenswrapper[4775]: I1216 14:57:35.974213 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" podUID="59b049c3-67e7-4fef-8a8e-b90fb5f75bba" containerName="route-controller-manager" containerID="cri-o://8ef56132fd6deb57e73e803310ee97a51302419ba6948ce3636f808409968764" gracePeriod=30
Dec 16 14:57:37 crc kubenswrapper[4775]: I1216 14:57:37.526163 4775 generic.go:334] "Generic (PLEG): container finished" podID="68f674b8-b7c3-43e8-b132-7d6b881cbd31" containerID="5d55f4b497106e7e0dc04a363d9e5a58d6a1d1626b5e6866bc391e04e0d5b7c8" exitCode=0
Dec 16 14:57:37 crc kubenswrapper[4775]: I1216 14:57:37.526255 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" event={"ID":"68f674b8-b7c3-43e8-b132-7d6b881cbd31","Type":"ContainerDied","Data":"5d55f4b497106e7e0dc04a363d9e5a58d6a1d1626b5e6866bc391e04e0d5b7c8"}
Dec 16 14:57:37 crc kubenswrapper[4775]: I1216 14:57:37.527490 4775 generic.go:334] "Generic (PLEG): container finished" podID="59b049c3-67e7-4fef-8a8e-b90fb5f75bba" containerID="8ef56132fd6deb57e73e803310ee97a51302419ba6948ce3636f808409968764" exitCode=0
Dec 16 14:57:37 crc kubenswrapper[4775]: I1216 14:57:37.527540 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" event={"ID":"59b049c3-67e7-4fef-8a8e-b90fb5f75bba","Type":"ContainerDied","Data":"8ef56132fd6deb57e73e803310ee97a51302419ba6948ce3636f808409968764"}
Dec 16 14:57:41 crc kubenswrapper[4775]: I1216 14:57:41.076341 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-sxl49"
Dec 16 14:57:42 crc kubenswrapper[4775]: I1216 14:57:42.203803 4775 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-dqxgd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Dec 16 14:57:42 crc kubenswrapper[4775]: I1216 14:57:42.203963 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" podUID="59b049c3-67e7-4fef-8a8e-b90fb5f75bba" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Dec 16 14:57:42 crc kubenswrapper[4775]: I1216 14:57:42.242238 4775 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-twhnr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Dec 16 14:57:42 crc kubenswrapper[4775]: I1216 14:57:42.242330 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" podUID="68f674b8-b7c3-43e8-b132-7d6b881cbd31" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused"
Dec 16 14:57:42 crc kubenswrapper[4775]: I1216 14:57:42.977193 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-5bmtw"
Dec 16 14:57:52 crc kubenswrapper[4775]: I1216 14:57:52.202787 4775 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-dqxgd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Dec 16 14:57:52 crc kubenswrapper[4775]: I1216 14:57:52.203428 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" podUID="59b049c3-67e7-4fef-8a8e-b90fb5f75bba" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Dec 16 14:57:52 crc kubenswrapper[4775]: I1216 14:57:52.242574 4775 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-twhnr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Dec 16 14:57:52 crc kubenswrapper[4775]: I1216 14:57:52.242643 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" podUID="68f674b8-b7c3-43e8-b132-7d6b881cbd31" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused"
Dec 16 14:57:53 crc kubenswrapper[4775]: I1216 14:57:53.069157 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b5vwq"
Dec 16 14:58:00 crc kubenswrapper[4775]: I1216 14:58:00.074577 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 16 14:58:00 crc kubenswrapper[4775]: E1216 14:58:00.075192 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb9581e-a66d-4235-a1c7-21aa3e149137" containerName="pruner"
Dec 16 14:58:00 crc kubenswrapper[4775]: I1216 14:58:00.075209 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb9581e-a66d-4235-a1c7-21aa3e149137" containerName="pruner"
Dec 16 14:58:00 crc kubenswrapper[4775]: E1216 14:58:00.075221 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c63468ee-465d-48df-b53e-ca5866d5a7ca" containerName="pruner"
Dec 16 14:58:00 crc kubenswrapper[4775]: I1216 14:58:00.075230 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c63468ee-465d-48df-b53e-ca5866d5a7ca" containerName="pruner"
Dec 16 14:58:00 crc kubenswrapper[4775]: I1216 14:58:00.075355 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bb9581e-a66d-4235-a1c7-21aa3e149137" containerName="pruner"
Dec 16 14:58:00 crc kubenswrapper[4775]: I1216 14:58:00.075374 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c63468ee-465d-48df-b53e-ca5866d5a7ca" containerName="pruner"
Dec 16 14:58:00 crc kubenswrapper[4775]: I1216 14:58:00.075925 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 14:58:00 crc kubenswrapper[4775]: I1216 14:58:00.080955 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 16 14:58:00 crc kubenswrapper[4775]: I1216 14:58:00.081390 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 16 14:58:00 crc kubenswrapper[4775]: I1216 14:58:00.087770 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 16 14:58:00 crc kubenswrapper[4775]: I1216 14:58:00.168556 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13d12fa6-4dd1-4786-9a14-51284cbf2f6d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"13d12fa6-4dd1-4786-9a14-51284cbf2f6d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 14:58:00 crc kubenswrapper[4775]: I1216 14:58:00.168820 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13d12fa6-4dd1-4786-9a14-51284cbf2f6d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"13d12fa6-4dd1-4786-9a14-51284cbf2f6d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 14:58:00 crc kubenswrapper[4775]: I1216 14:58:00.270517 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13d12fa6-4dd1-4786-9a14-51284cbf2f6d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"13d12fa6-4dd1-4786-9a14-51284cbf2f6d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 14:58:00 crc kubenswrapper[4775]: I1216 14:58:00.270692 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/13d12fa6-4dd1-4786-9a14-51284cbf2f6d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"13d12fa6-4dd1-4786-9a14-51284cbf2f6d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 14:58:00 crc kubenswrapper[4775]: I1216 14:58:00.270788 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13d12fa6-4dd1-4786-9a14-51284cbf2f6d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"13d12fa6-4dd1-4786-9a14-51284cbf2f6d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 14:58:00 crc kubenswrapper[4775]: I1216 14:58:00.298654 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13d12fa6-4dd1-4786-9a14-51284cbf2f6d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"13d12fa6-4dd1-4786-9a14-51284cbf2f6d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 14:58:00 crc kubenswrapper[4775]: I1216 14:58:00.401849 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 14:58:02 crc kubenswrapper[4775]: I1216 14:58:02.203460 4775 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-dqxgd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Dec 16 14:58:02 crc kubenswrapper[4775]: I1216 14:58:02.203523 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" podUID="59b049c3-67e7-4fef-8a8e-b90fb5f75bba" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 16 14:58:02 crc kubenswrapper[4775]: I1216 14:58:02.241850 4775 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-twhnr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Dec 16 14:58:02 crc kubenswrapper[4775]: I1216 14:58:02.241973 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" podUID="68f674b8-b7c3-43e8-b132-7d6b881cbd31" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Dec 16 14:58:02 crc kubenswrapper[4775]: I1216 14:58:02.869000 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 14:58:02 
crc kubenswrapper[4775]: I1216 14:58:02.869067 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 14:58:02 crc kubenswrapper[4775]: I1216 14:58:02.869122 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 14:58:02 crc kubenswrapper[4775]: I1216 14:58:02.869781 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67"} pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 14:58:02 crc kubenswrapper[4775]: I1216 14:58:02.869935 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" containerID="cri-o://e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67" gracePeriod=600 Dec 16 14:58:05 crc kubenswrapper[4775]: I1216 14:58:05.680040 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 16 14:58:05 crc kubenswrapper[4775]: I1216 14:58:05.681242 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 16 14:58:05 crc kubenswrapper[4775]: I1216 14:58:05.687443 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 16 14:58:05 crc kubenswrapper[4775]: I1216 14:58:05.854647 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68e5173d-4139-4745-bbb1-a20286fbf0f3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"68e5173d-4139-4745-bbb1-a20286fbf0f3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 14:58:05 crc kubenswrapper[4775]: I1216 14:58:05.854758 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68e5173d-4139-4745-bbb1-a20286fbf0f3-kube-api-access\") pod \"installer-9-crc\" (UID: \"68e5173d-4139-4745-bbb1-a20286fbf0f3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 14:58:05 crc kubenswrapper[4775]: I1216 14:58:05.854879 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/68e5173d-4139-4745-bbb1-a20286fbf0f3-var-lock\") pod \"installer-9-crc\" (UID: \"68e5173d-4139-4745-bbb1-a20286fbf0f3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 14:58:05 crc kubenswrapper[4775]: I1216 14:58:05.956254 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68e5173d-4139-4745-bbb1-a20286fbf0f3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"68e5173d-4139-4745-bbb1-a20286fbf0f3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 14:58:05 crc kubenswrapper[4775]: I1216 14:58:05.956342 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/68e5173d-4139-4745-bbb1-a20286fbf0f3-kube-api-access\") pod \"installer-9-crc\" (UID: \"68e5173d-4139-4745-bbb1-a20286fbf0f3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 14:58:05 crc kubenswrapper[4775]: I1216 14:58:05.956382 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/68e5173d-4139-4745-bbb1-a20286fbf0f3-var-lock\") pod \"installer-9-crc\" (UID: \"68e5173d-4139-4745-bbb1-a20286fbf0f3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 14:58:05 crc kubenswrapper[4775]: I1216 14:58:05.956413 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68e5173d-4139-4745-bbb1-a20286fbf0f3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"68e5173d-4139-4745-bbb1-a20286fbf0f3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 14:58:05 crc kubenswrapper[4775]: I1216 14:58:05.956457 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/68e5173d-4139-4745-bbb1-a20286fbf0f3-var-lock\") pod \"installer-9-crc\" (UID: \"68e5173d-4139-4745-bbb1-a20286fbf0f3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 14:58:05 crc kubenswrapper[4775]: I1216 14:58:05.984198 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68e5173d-4139-4745-bbb1-a20286fbf0f3-kube-api-access\") pod \"installer-9-crc\" (UID: \"68e5173d-4139-4745-bbb1-a20286fbf0f3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 14:58:06 crc kubenswrapper[4775]: I1216 14:58:06.008288 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 16 14:58:12 crc kubenswrapper[4775]: I1216 14:58:12.203184 4775 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-dqxgd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Dec 16 14:58:12 crc kubenswrapper[4775]: I1216 14:58:12.203663 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" podUID="59b049c3-67e7-4fef-8a8e-b90fb5f75bba" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 16 14:58:12 crc kubenswrapper[4775]: I1216 14:58:12.243320 4775 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-twhnr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Dec 16 14:58:12 crc kubenswrapper[4775]: I1216 14:58:12.243386 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" podUID="68f674b8-b7c3-43e8-b132-7d6b881cbd31" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Dec 16 14:58:15 crc kubenswrapper[4775]: I1216 14:58:15.765595 4775 generic.go:334] "Generic (PLEG): container finished" podID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerID="e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67" exitCode=0 Dec 16 14:58:15 crc kubenswrapper[4775]: I1216 14:58:15.765687 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerDied","Data":"e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67"} Dec 16 14:58:22 crc kubenswrapper[4775]: E1216 14:58:22.489879 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 16 14:58:22 crc kubenswrapper[4775]: E1216 14:58:22.490580 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v4dgn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false
,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-dh2bb_openshift-marketplace(ae4804bb-2669-48fc-aa42-3e4f1c94323b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 14:58:22 crc kubenswrapper[4775]: E1216 14:58:22.491832 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-dh2bb" podUID="ae4804bb-2669-48fc-aa42-3e4f1c94323b" Dec 16 14:58:23 crc kubenswrapper[4775]: I1216 14:58:23.206266 4775 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-dqxgd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": context deadline exceeded" start-of-body= Dec 16 14:58:23 crc kubenswrapper[4775]: I1216 14:58:23.206336 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" podUID="59b049c3-67e7-4fef-8a8e-b90fb5f75bba" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": context deadline exceeded" Dec 16 14:58:23 crc kubenswrapper[4775]: I1216 14:58:23.241922 4775 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-twhnr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 14:58:23 crc 
kubenswrapper[4775]: I1216 14:58:23.242319 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" podUID="68f674b8-b7c3-43e8-b132-7d6b881cbd31" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 14:58:23 crc kubenswrapper[4775]: E1216 14:58:23.557104 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-dh2bb" podUID="ae4804bb-2669-48fc-aa42-3e4f1c94323b" Dec 16 14:58:23 crc kubenswrapper[4775]: E1216 14:58:23.618144 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 16 14:58:23 crc kubenswrapper[4775]: E1216 14:58:23.618331 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q287q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rhnlc_openshift-marketplace(d5046921-0655-4cef-b310-018ed7ea22c4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 14:58:23 crc kubenswrapper[4775]: E1216 14:58:23.619558 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rhnlc" podUID="d5046921-0655-4cef-b310-018ed7ea22c4" Dec 16 14:58:23 crc 
kubenswrapper[4775]: E1216 14:58:23.629798 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 16 14:58:23 crc kubenswrapper[4775]: E1216 14:58:23.629921 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rlqbr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-4vbnc_openshift-marketplace(6b2be658-340d-4dc2-89b8-ee1fbde43d23): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 14:58:23 crc kubenswrapper[4775]: E1216 14:58:23.631144 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-4vbnc" podUID="6b2be658-340d-4dc2-89b8-ee1fbde43d23" Dec 16 14:58:23 crc kubenswrapper[4775]: E1216 14:58:23.740400 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 16 14:58:23 crc kubenswrapper[4775]: E1216 14:58:23.740554 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bfg7j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-dm57c_openshift-marketplace(7c243fb1-03dc-4ed6-9fa6-418e871d8b5a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 14:58:23 crc kubenswrapper[4775]: E1216 14:58:23.741903 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-dm57c" podUID="7c243fb1-03dc-4ed6-9fa6-418e871d8b5a" Dec 16 14:58:26 crc 
kubenswrapper[4775]: E1216 14:58:26.772371 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-dm57c" podUID="7c243fb1-03dc-4ed6-9fa6-418e871d8b5a" Dec 16 14:58:26 crc kubenswrapper[4775]: E1216 14:58:26.772520 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4vbnc" podUID="6b2be658-340d-4dc2-89b8-ee1fbde43d23" Dec 16 14:58:26 crc kubenswrapper[4775]: E1216 14:58:26.772549 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rhnlc" podUID="d5046921-0655-4cef-b310-018ed7ea22c4" Dec 16 14:58:26 crc kubenswrapper[4775]: E1216 14:58:26.848482 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 16 14:58:26 crc kubenswrapper[4775]: E1216 14:58:26.848841 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-48r5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-wrkz2_openshift-marketplace(f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 14:58:26 crc kubenswrapper[4775]: E1216 14:58:26.850592 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-wrkz2" podUID="f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" Dec 16 14:58:27 crc 
kubenswrapper[4775]: E1216 14:58:27.006732 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 16 14:58:27 crc kubenswrapper[4775]: E1216 14:58:27.006957 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bjfff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-xmqlw_openshift-marketplace(a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 14:58:27 crc kubenswrapper[4775]: E1216 14:58:27.008435 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xmqlw" podUID="a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c" Dec 16 14:58:28 crc kubenswrapper[4775]: E1216 14:58:28.726660 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-wrkz2" podUID="f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" Dec 16 14:58:28 crc kubenswrapper[4775]: E1216 14:58:28.727308 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xmqlw" podUID="a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c" Dec 16 14:58:28 crc kubenswrapper[4775]: E1216 14:58:28.812878 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 16 14:58:28 crc kubenswrapper[4775]: E1216 14:58:28.813366 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xmvtx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8l8g4_openshift-marketplace(ef64597f-59f1-47be-afc6-aa95fb3c355c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 14:58:28 crc kubenswrapper[4775]: E1216 14:58:28.816036 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8l8g4" podUID="ef64597f-59f1-47be-afc6-aa95fb3c355c" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.847051 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.875962 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.880373 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" event={"ID":"68f674b8-b7c3-43e8-b132-7d6b881cbd31","Type":"ContainerDied","Data":"018ce0de313ae57dd0abfbbe0adc33e46bf4b6593bc70f3aeb449e507c9d0080"} Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.880422 4775 scope.go:117] "RemoveContainer" containerID="5d55f4b497106e7e0dc04a363d9e5a58d6a1d1626b5e6866bc391e04e0d5b7c8" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.883990 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78c5984846-crs58"] Dec 16 14:58:28 crc kubenswrapper[4775]: E1216 14:58:28.884290 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f674b8-b7c3-43e8-b132-7d6b881cbd31" containerName="controller-manager" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.884310 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f674b8-b7c3-43e8-b132-7d6b881cbd31" containerName="controller-manager" Dec 16 14:58:28 crc kubenswrapper[4775]: E1216 14:58:28.884339 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b049c3-67e7-4fef-8a8e-b90fb5f75bba" containerName="route-controller-manager" Dec 16 14:58:28 crc 
kubenswrapper[4775]: I1216 14:58:28.884348 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b049c3-67e7-4fef-8a8e-b90fb5f75bba" containerName="route-controller-manager" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.884480 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f674b8-b7c3-43e8-b132-7d6b881cbd31" containerName="controller-manager" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.884496 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="59b049c3-67e7-4fef-8a8e-b90fb5f75bba" containerName="route-controller-manager" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.885023 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.891328 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.891348 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd" event={"ID":"59b049c3-67e7-4fef-8a8e-b90fb5f75bba","Type":"ContainerDied","Data":"00702ece8e40a76fbc272a32c36598fb69e668d6d3f53164dc399a2855953951"} Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.906469 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-serving-cert\") pod \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\" (UID: \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\") " Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.906551 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-config\") pod 
\"68f674b8-b7c3-43e8-b132-7d6b881cbd31\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.906576 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-client-ca\") pod \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.906628 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68f674b8-b7c3-43e8-b132-7d6b881cbd31-serving-cert\") pod \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.906659 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zq42\" (UniqueName: \"kubernetes.io/projected/68f674b8-b7c3-43e8-b132-7d6b881cbd31-kube-api-access-9zq42\") pod \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.906701 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxzrv\" (UniqueName: \"kubernetes.io/projected/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-kube-api-access-vxzrv\") pod \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\" (UID: \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\") " Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.906727 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-client-ca\") pod \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\" (UID: \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\") " Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.906801 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-proxy-ca-bundles\") pod \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\" (UID: \"68f674b8-b7c3-43e8-b132-7d6b881cbd31\") " Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.906822 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-config\") pod \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\" (UID: \"59b049c3-67e7-4fef-8a8e-b90fb5f75bba\") " Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.907044 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-config\") pod \"controller-manager-78c5984846-crs58\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") " pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.907066 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-proxy-ca-bundles\") pod \"controller-manager-78c5984846-crs58\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") " pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.907094 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-serving-cert\") pod \"controller-manager-78c5984846-crs58\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") " pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.907143 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-client-ca\") pod \"controller-manager-78c5984846-crs58\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") " pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:28 crc kubenswrapper[4775]: E1216 14:58:28.907130 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8l8g4" podUID="ef64597f-59f1-47be-afc6-aa95fb3c355c" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.907175 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f64k8\" (UniqueName: \"kubernetes.io/projected/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-kube-api-access-f64k8\") pod \"controller-manager-78c5984846-crs58\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") " pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.909697 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-config" (OuterVolumeSpecName: "config") pod "68f674b8-b7c3-43e8-b132-7d6b881cbd31" (UID: "68f674b8-b7c3-43e8-b132-7d6b881cbd31"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.910518 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-client-ca" (OuterVolumeSpecName: "client-ca") pod "68f674b8-b7c3-43e8-b132-7d6b881cbd31" (UID: "68f674b8-b7c3-43e8-b132-7d6b881cbd31"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.910575 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78c5984846-crs58"] Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.913428 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-config" (OuterVolumeSpecName: "config") pod "59b049c3-67e7-4fef-8a8e-b90fb5f75bba" (UID: "59b049c3-67e7-4fef-8a8e-b90fb5f75bba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.915736 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-client-ca" (OuterVolumeSpecName: "client-ca") pod "59b049c3-67e7-4fef-8a8e-b90fb5f75bba" (UID: "59b049c3-67e7-4fef-8a8e-b90fb5f75bba"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.923085 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "59b049c3-67e7-4fef-8a8e-b90fb5f75bba" (UID: "59b049c3-67e7-4fef-8a8e-b90fb5f75bba"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.927132 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "68f674b8-b7c3-43e8-b132-7d6b881cbd31" (UID: "68f674b8-b7c3-43e8-b132-7d6b881cbd31"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.928830 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f674b8-b7c3-43e8-b132-7d6b881cbd31-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "68f674b8-b7c3-43e8-b132-7d6b881cbd31" (UID: "68f674b8-b7c3-43e8-b132-7d6b881cbd31"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.929286 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-kube-api-access-vxzrv" (OuterVolumeSpecName: "kube-api-access-vxzrv") pod "59b049c3-67e7-4fef-8a8e-b90fb5f75bba" (UID: "59b049c3-67e7-4fef-8a8e-b90fb5f75bba"). InnerVolumeSpecName "kube-api-access-vxzrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:58:28 crc kubenswrapper[4775]: I1216 14:58:28.930196 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68f674b8-b7c3-43e8-b132-7d6b881cbd31-kube-api-access-9zq42" (OuterVolumeSpecName: "kube-api-access-9zq42") pod "68f674b8-b7c3-43e8-b132-7d6b881cbd31" (UID: "68f674b8-b7c3-43e8-b132-7d6b881cbd31"). InnerVolumeSpecName "kube-api-access-9zq42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:58:28 crc kubenswrapper[4775]: E1216 14:58:28.949131 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 16 14:58:28 crc kubenswrapper[4775]: E1216 14:58:28.949297 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t4w5x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},
RestartPolicy:nil,} start failed in pod community-operators-hhm9d_openshift-marketplace(ce7e6431-8250-485f-a202-a781b4b719cb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 14:58:28 crc kubenswrapper[4775]: E1216 14:58:28.987987 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hhm9d" podUID="ce7e6431-8250-485f-a202-a781b4b719cb" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.009790 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-config\") pod \"controller-manager-78c5984846-crs58\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") " pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.009861 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-proxy-ca-bundles\") pod \"controller-manager-78c5984846-crs58\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") " pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.009918 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-serving-cert\") pod \"controller-manager-78c5984846-crs58\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") " pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.009964 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-client-ca\") pod \"controller-manager-78c5984846-crs58\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") " pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.010029 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f64k8\" (UniqueName: \"kubernetes.io/projected/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-kube-api-access-f64k8\") pod \"controller-manager-78c5984846-crs58\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") " pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.012155 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.012173 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.012184 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.012193 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.012201 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/68f674b8-b7c3-43e8-b132-7d6b881cbd31-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.012209 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68f674b8-b7c3-43e8-b132-7d6b881cbd31-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.012217 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zq42\" (UniqueName: \"kubernetes.io/projected/68f674b8-b7c3-43e8-b132-7d6b881cbd31-kube-api-access-9zq42\") on node \"crc\" DevicePath \"\"" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.012227 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxzrv\" (UniqueName: \"kubernetes.io/projected/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-kube-api-access-vxzrv\") on node \"crc\" DevicePath \"\"" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.012235 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59b049c3-67e7-4fef-8a8e-b90fb5f75bba-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.015478 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-config\") pod \"controller-manager-78c5984846-crs58\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") " pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.016225 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-proxy-ca-bundles\") pod \"controller-manager-78c5984846-crs58\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") " 
pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.017108 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-client-ca\") pod \"controller-manager-78c5984846-crs58\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") " pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.039102 4775 scope.go:117] "RemoveContainer" containerID="8ef56132fd6deb57e73e803310ee97a51302419ba6948ce3636f808409968764" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.039751 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-serving-cert\") pod \"controller-manager-78c5984846-crs58\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") " pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.067756 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f64k8\" (UniqueName: \"kubernetes.io/projected/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-kube-api-access-f64k8\") pod \"controller-manager-78c5984846-crs58\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") " pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.227187 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd"] Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.230997 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dqxgd"] Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.251579 4775 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 16 14:58:29 crc kubenswrapper[4775]: W1216 14:58:29.256358 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod13d12fa6_4dd1_4786_9a14_51284cbf2f6d.slice/crio-cc2dff8fab839b871291aa08dcef8a20de75d249a265aabcb62ca369a89d5760 WatchSource:0}: Error finding container cc2dff8fab839b871291aa08dcef8a20de75d249a265aabcb62ca369a89d5760: Status 404 returned error can't find the container with id cc2dff8fab839b871291aa08dcef8a20de75d249a265aabcb62ca369a89d5760 Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.286873 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.349582 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59b049c3-67e7-4fef-8a8e-b90fb5f75bba" path="/var/lib/kubelet/pods/59b049c3-67e7-4fef-8a8e-b90fb5f75bba/volumes" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.351038 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 16 14:58:29 crc kubenswrapper[4775]: W1216 14:58:29.362932 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod68e5173d_4139_4745_bbb1_a20286fbf0f3.slice/crio-fff939026ac1e6efc91dd8317e6198d6794f3b2ef33a66aa2f50a57db6837cfb WatchSource:0}: Error finding container fff939026ac1e6efc91dd8317e6198d6794f3b2ef33a66aa2f50a57db6837cfb: Status 404 returned error can't find the container with id fff939026ac1e6efc91dd8317e6198d6794f3b2ef33a66aa2f50a57db6837cfb Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.524969 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78c5984846-crs58"] Dec 16 14:58:29 crc kubenswrapper[4775]: W1216 14:58:29.538768 4775 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8d6fb76_ab24_474f_9fdb_5c5893bd6bbd.slice/crio-b94d789436880dcb38324c36709cb5693dab769812b8d11b208861be4095e098 WatchSource:0}: Error finding container b94d789436880dcb38324c36709cb5693dab769812b8d11b208861be4095e098: Status 404 returned error can't find the container with id b94d789436880dcb38324c36709cb5693dab769812b8d11b208861be4095e098 Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.899166 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-twhnr" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.903084 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"68e5173d-4139-4745-bbb1-a20286fbf0f3","Type":"ContainerStarted","Data":"7f6aa3415735c8934af5a042d801a60e84b699b4a7198703f69fd8c83e930c04"} Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.903131 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"68e5173d-4139-4745-bbb1-a20286fbf0f3","Type":"ContainerStarted","Data":"fff939026ac1e6efc91dd8317e6198d6794f3b2ef33a66aa2f50a57db6837cfb"} Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.904239 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"13d12fa6-4dd1-4786-9a14-51284cbf2f6d","Type":"ContainerStarted","Data":"5d0a8cd8fd063014057781e4d9a3eb84d227f0e25efc9f5f599b41403d82b37e"} Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.904270 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"13d12fa6-4dd1-4786-9a14-51284cbf2f6d","Type":"ContainerStarted","Data":"cc2dff8fab839b871291aa08dcef8a20de75d249a265aabcb62ca369a89d5760"} Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.905630 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c5984846-crs58" event={"ID":"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd","Type":"ContainerStarted","Data":"9bd19c8730a64f6306eeac844833bf1699005876788a0a5db8867f28181005d1"} Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.905663 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c5984846-crs58" event={"ID":"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd","Type":"ContainerStarted","Data":"b94d789436880dcb38324c36709cb5693dab769812b8d11b208861be4095e098"} Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.905809 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.908194 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"9e3465738032f654be4ca1d279ee3d3d38d1b27f93903e514f0b4016991fc071"} Dec 16 14:58:29 crc kubenswrapper[4775]: E1216 14:58:29.909201 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hhm9d" podUID="ce7e6431-8250-485f-a202-a781b4b719cb" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.915965 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78c5984846-crs58" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.925288 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=24.925257481 
podStartE2EDuration="24.925257481s" podCreationTimestamp="2025-12-16 14:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:58:29.924139224 +0000 UTC m=+234.875218157" watchObservedRunningTime="2025-12-16 14:58:29.925257481 +0000 UTC m=+234.876336414" Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.983486 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-twhnr"] Dec 16 14:58:29 crc kubenswrapper[4775]: I1216 14:58:29.992461 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-twhnr"] Dec 16 14:58:30 crc kubenswrapper[4775]: I1216 14:58:30.011281 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78c5984846-crs58" podStartSLOduration=35.011264467 podStartE2EDuration="35.011264467s" podCreationTimestamp="2025-12-16 14:57:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:58:30.008565809 +0000 UTC m=+234.959644752" watchObservedRunningTime="2025-12-16 14:58:30.011264467 +0000 UTC m=+234.962343380" Dec 16 14:58:30 crc kubenswrapper[4775]: I1216 14:58:30.915261 4775 generic.go:334] "Generic (PLEG): container finished" podID="13d12fa6-4dd1-4786-9a14-51284cbf2f6d" containerID="5d0a8cd8fd063014057781e4d9a3eb84d227f0e25efc9f5f599b41403d82b37e" exitCode=0 Dec 16 14:58:30 crc kubenswrapper[4775]: I1216 14:58:30.915328 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"13d12fa6-4dd1-4786-9a14-51284cbf2f6d","Type":"ContainerDied","Data":"5d0a8cd8fd063014057781e4d9a3eb84d227f0e25efc9f5f599b41403d82b37e"} Dec 16 14:58:30 crc kubenswrapper[4775]: I1216 14:58:30.977368 4775 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb"] Dec 16 14:58:30 crc kubenswrapper[4775]: I1216 14:58:30.978502 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" Dec 16 14:58:30 crc kubenswrapper[4775]: I1216 14:58:30.981163 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 14:58:30 crc kubenswrapper[4775]: I1216 14:58:30.981224 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 14:58:30 crc kubenswrapper[4775]: I1216 14:58:30.981639 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 14:58:30 crc kubenswrapper[4775]: I1216 14:58:30.982349 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 14:58:30 crc kubenswrapper[4775]: I1216 14:58:30.982376 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 14:58:30 crc kubenswrapper[4775]: I1216 14:58:30.982471 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 14:58:30 crc kubenswrapper[4775]: I1216 14:58:30.995724 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb"] Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.089519 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-config\") pod \"route-controller-manager-6cc8fccf5d-jlmvb\" (UID: 
\"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\") " pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.089572 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-client-ca\") pod \"route-controller-manager-6cc8fccf5d-jlmvb\" (UID: \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\") " pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.089621 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-serving-cert\") pod \"route-controller-manager-6cc8fccf5d-jlmvb\" (UID: \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\") " pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.089861 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j64cs\" (UniqueName: \"kubernetes.io/projected/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-kube-api-access-j64cs\") pod \"route-controller-manager-6cc8fccf5d-jlmvb\" (UID: \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\") " pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.191697 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-config\") pod \"route-controller-manager-6cc8fccf5d-jlmvb\" (UID: \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\") " pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.191818 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-client-ca\") pod \"route-controller-manager-6cc8fccf5d-jlmvb\" (UID: \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\") " pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.191918 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-serving-cert\") pod \"route-controller-manager-6cc8fccf5d-jlmvb\" (UID: \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\") " pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.192009 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j64cs\" (UniqueName: \"kubernetes.io/projected/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-kube-api-access-j64cs\") pod \"route-controller-manager-6cc8fccf5d-jlmvb\" (UID: \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\") " pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.193074 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-client-ca\") pod \"route-controller-manager-6cc8fccf5d-jlmvb\" (UID: \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\") " pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.193286 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-config\") pod \"route-controller-manager-6cc8fccf5d-jlmvb\" (UID: 
\"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\") " pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.198026 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-serving-cert\") pod \"route-controller-manager-6cc8fccf5d-jlmvb\" (UID: \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\") " pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.209113 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j64cs\" (UniqueName: \"kubernetes.io/projected/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-kube-api-access-j64cs\") pod \"route-controller-manager-6cc8fccf5d-jlmvb\" (UID: \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\") " pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.301008 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.348715 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68f674b8-b7c3-43e8-b132-7d6b881cbd31" path="/var/lib/kubelet/pods/68f674b8-b7c3-43e8-b132-7d6b881cbd31/volumes" Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.722479 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb"] Dec 16 14:58:31 crc kubenswrapper[4775]: W1216 14:58:31.730754 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e534235_ae6b_4a4e_9bb6_0d53d33f69bc.slice/crio-17ec2aeaf1e185bdcec67cbfd4725766d0e1a2edba6cdd4998f791fe6bc744d1 WatchSource:0}: Error finding container 17ec2aeaf1e185bdcec67cbfd4725766d0e1a2edba6cdd4998f791fe6bc744d1: Status 404 returned error can't find the container with id 17ec2aeaf1e185bdcec67cbfd4725766d0e1a2edba6cdd4998f791fe6bc744d1 Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.922842 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" event={"ID":"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc","Type":"ContainerStarted","Data":"b724adc6bfd4bb154cdf2112522aecd04afcf1f8b48e34d20df417ea23f22a25"} Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.922896 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" event={"ID":"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc","Type":"ContainerStarted","Data":"17ec2aeaf1e185bdcec67cbfd4725766d0e1a2edba6cdd4998f791fe6bc744d1"} Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.923352 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" Dec 16 14:58:31 crc kubenswrapper[4775]: I1216 14:58:31.945383 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" podStartSLOduration=36.94536148 podStartE2EDuration="36.94536148s" podCreationTimestamp="2025-12-16 14:57:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:58:31.942090363 +0000 UTC m=+236.893169296" watchObservedRunningTime="2025-12-16 14:58:31.94536148 +0000 UTC m=+236.896440403" Dec 16 14:58:32 crc kubenswrapper[4775]: I1216 14:58:32.132652 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 14:58:32 crc kubenswrapper[4775]: I1216 14:58:32.209302 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13d12fa6-4dd1-4786-9a14-51284cbf2f6d-kubelet-dir\") pod \"13d12fa6-4dd1-4786-9a14-51284cbf2f6d\" (UID: \"13d12fa6-4dd1-4786-9a14-51284cbf2f6d\") " Dec 16 14:58:32 crc kubenswrapper[4775]: I1216 14:58:32.209439 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13d12fa6-4dd1-4786-9a14-51284cbf2f6d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "13d12fa6-4dd1-4786-9a14-51284cbf2f6d" (UID: "13d12fa6-4dd1-4786-9a14-51284cbf2f6d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 14:58:32 crc kubenswrapper[4775]: I1216 14:58:32.209470 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13d12fa6-4dd1-4786-9a14-51284cbf2f6d-kube-api-access\") pod \"13d12fa6-4dd1-4786-9a14-51284cbf2f6d\" (UID: \"13d12fa6-4dd1-4786-9a14-51284cbf2f6d\") " Dec 16 14:58:32 crc kubenswrapper[4775]: I1216 14:58:32.209832 4775 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13d12fa6-4dd1-4786-9a14-51284cbf2f6d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 16 14:58:32 crc kubenswrapper[4775]: I1216 14:58:32.215604 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13d12fa6-4dd1-4786-9a14-51284cbf2f6d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "13d12fa6-4dd1-4786-9a14-51284cbf2f6d" (UID: "13d12fa6-4dd1-4786-9a14-51284cbf2f6d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:58:32 crc kubenswrapper[4775]: I1216 14:58:32.311054 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13d12fa6-4dd1-4786-9a14-51284cbf2f6d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 14:58:32 crc kubenswrapper[4775]: I1216 14:58:32.661871 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" Dec 16 14:58:32 crc kubenswrapper[4775]: I1216 14:58:32.929151 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"13d12fa6-4dd1-4786-9a14-51284cbf2f6d","Type":"ContainerDied","Data":"cc2dff8fab839b871291aa08dcef8a20de75d249a265aabcb62ca369a89d5760"} Dec 16 14:58:32 crc kubenswrapper[4775]: I1216 14:58:32.929191 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 14:58:32 crc kubenswrapper[4775]: I1216 14:58:32.929205 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc2dff8fab839b871291aa08dcef8a20de75d249a265aabcb62ca369a89d5760" Dec 16 14:58:37 crc kubenswrapper[4775]: I1216 14:58:37.986363 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh2bb" event={"ID":"ae4804bb-2669-48fc-aa42-3e4f1c94323b","Type":"ContainerStarted","Data":"faa1a80b890f64b2a41dde8f97aabaf1e4552b03d3502e131a4c42cbe5f475f4"} Dec 16 14:58:38 crc kubenswrapper[4775]: I1216 14:58:38.993495 4775 generic.go:334] "Generic (PLEG): container finished" podID="ae4804bb-2669-48fc-aa42-3e4f1c94323b" containerID="faa1a80b890f64b2a41dde8f97aabaf1e4552b03d3502e131a4c42cbe5f475f4" exitCode=0 Dec 16 14:58:38 crc kubenswrapper[4775]: I1216 14:58:38.993630 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-dh2bb" event={"ID":"ae4804bb-2669-48fc-aa42-3e4f1c94323b","Type":"ContainerDied","Data":"faa1a80b890f64b2a41dde8f97aabaf1e4552b03d3502e131a4c42cbe5f475f4"} Dec 16 14:58:41 crc kubenswrapper[4775]: I1216 14:58:41.003967 4775 generic.go:334] "Generic (PLEG): container finished" podID="7c243fb1-03dc-4ed6-9fa6-418e871d8b5a" containerID="d741ffed290d091cdb6c50d9f7dcd2fc8e881279fd369afe3fac4578190d2430" exitCode=0 Dec 16 14:58:41 crc kubenswrapper[4775]: I1216 14:58:41.004070 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm57c" event={"ID":"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a","Type":"ContainerDied","Data":"d741ffed290d091cdb6c50d9f7dcd2fc8e881279fd369afe3fac4578190d2430"} Dec 16 14:58:41 crc kubenswrapper[4775]: I1216 14:58:41.008130 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh2bb" event={"ID":"ae4804bb-2669-48fc-aa42-3e4f1c94323b","Type":"ContainerStarted","Data":"0fc1dd1e460505118499bbacfea4a6ed132b1cc3b385d8b14d06a0a087a4b3b6"} Dec 16 14:58:41 crc kubenswrapper[4775]: I1216 14:58:41.039874 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dh2bb" podStartSLOduration=3.755729053 podStartE2EDuration="1m24.039852501s" podCreationTimestamp="2025-12-16 14:57:17 +0000 UTC" firstStartedPulling="2025-12-16 14:57:19.932153533 +0000 UTC m=+164.883232456" lastFinishedPulling="2025-12-16 14:58:40.216276981 +0000 UTC m=+245.167355904" observedRunningTime="2025-12-16 14:58:41.036311286 +0000 UTC m=+245.987390219" watchObservedRunningTime="2025-12-16 14:58:41.039852501 +0000 UTC m=+245.990931424" Dec 16 14:58:42 crc kubenswrapper[4775]: I1216 14:58:42.016953 4775 generic.go:334] "Generic (PLEG): container finished" podID="6b2be658-340d-4dc2-89b8-ee1fbde43d23" containerID="04426c3e77dc660a0a38af544b3e05b151570f49de7c9152482a6304bfdd8bd5" exitCode=0 Dec 
16 14:58:42 crc kubenswrapper[4775]: I1216 14:58:42.017010 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vbnc" event={"ID":"6b2be658-340d-4dc2-89b8-ee1fbde43d23","Type":"ContainerDied","Data":"04426c3e77dc660a0a38af544b3e05b151570f49de7c9152482a6304bfdd8bd5"} Dec 16 14:58:43 crc kubenswrapper[4775]: I1216 14:58:43.025744 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vbnc" event={"ID":"6b2be658-340d-4dc2-89b8-ee1fbde43d23","Type":"ContainerStarted","Data":"b33a9584a38b9c1e3c9360213d802f11a4520acd114bca8b3f868ea70287a2dd"} Dec 16 14:58:43 crc kubenswrapper[4775]: I1216 14:58:43.032865 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm57c" event={"ID":"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a","Type":"ContainerStarted","Data":"cc7b3b1772643d1a152c3a976c9839c01fdc8880ebab651ae4ae843a85fc5169"} Dec 16 14:58:43 crc kubenswrapper[4775]: I1216 14:58:43.037696 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrkz2" event={"ID":"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2","Type":"ContainerStarted","Data":"e12314d0ee0b4d0c27ecaf8f3aae7dd048927c3d6027949a33f9d6627ca8deec"} Dec 16 14:58:43 crc kubenswrapper[4775]: I1216 14:58:43.039849 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhnlc" event={"ID":"d5046921-0655-4cef-b310-018ed7ea22c4","Type":"ContainerStarted","Data":"d918aa1984ca6e854a58b04cabc054886a8c4244a4e5bc5e9cb953423d39de4d"} Dec 16 14:58:43 crc kubenswrapper[4775]: I1216 14:58:43.051383 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4vbnc" podStartSLOduration=2.473260097 podStartE2EDuration="1m23.051363069s" podCreationTimestamp="2025-12-16 14:57:20 +0000 UTC" firstStartedPulling="2025-12-16 14:57:22.036233338 +0000 UTC m=+166.987312261" 
lastFinishedPulling="2025-12-16 14:58:42.61433632 +0000 UTC m=+247.565415233" observedRunningTime="2025-12-16 14:58:43.050375797 +0000 UTC m=+248.001454720" watchObservedRunningTime="2025-12-16 14:58:43.051363069 +0000 UTC m=+248.002441992" Dec 16 14:58:43 crc kubenswrapper[4775]: I1216 14:58:43.107806 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dm57c" podStartSLOduration=3.899363131 podStartE2EDuration="1m25.10778769s" podCreationTimestamp="2025-12-16 14:57:18 +0000 UTC" firstStartedPulling="2025-12-16 14:57:20.978417124 +0000 UTC m=+165.929496047" lastFinishedPulling="2025-12-16 14:58:42.186841683 +0000 UTC m=+247.137920606" observedRunningTime="2025-12-16 14:58:43.102807658 +0000 UTC m=+248.053886601" watchObservedRunningTime="2025-12-16 14:58:43.10778769 +0000 UTC m=+248.058866613" Dec 16 14:58:44 crc kubenswrapper[4775]: I1216 14:58:44.047630 4775 generic.go:334] "Generic (PLEG): container finished" podID="d5046921-0655-4cef-b310-018ed7ea22c4" containerID="d918aa1984ca6e854a58b04cabc054886a8c4244a4e5bc5e9cb953423d39de4d" exitCode=0 Dec 16 14:58:44 crc kubenswrapper[4775]: I1216 14:58:44.047682 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhnlc" event={"ID":"d5046921-0655-4cef-b310-018ed7ea22c4","Type":"ContainerDied","Data":"d918aa1984ca6e854a58b04cabc054886a8c4244a4e5bc5e9cb953423d39de4d"} Dec 16 14:58:45 crc kubenswrapper[4775]: I1216 14:58:45.054927 4775 generic.go:334] "Generic (PLEG): container finished" podID="f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" containerID="e12314d0ee0b4d0c27ecaf8f3aae7dd048927c3d6027949a33f9d6627ca8deec" exitCode=0 Dec 16 14:58:45 crc kubenswrapper[4775]: I1216 14:58:45.054976 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrkz2" 
event={"ID":"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2","Type":"ContainerDied","Data":"e12314d0ee0b4d0c27ecaf8f3aae7dd048927c3d6027949a33f9d6627ca8deec"} Dec 16 14:58:48 crc kubenswrapper[4775]: I1216 14:58:48.269048 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dh2bb" Dec 16 14:58:48 crc kubenswrapper[4775]: I1216 14:58:48.269618 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dh2bb" Dec 16 14:58:48 crc kubenswrapper[4775]: I1216 14:58:48.387807 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dh2bb" Dec 16 14:58:48 crc kubenswrapper[4775]: I1216 14:58:48.989650 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dm57c" Dec 16 14:58:48 crc kubenswrapper[4775]: I1216 14:58:48.990118 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dm57c" Dec 16 14:58:49 crc kubenswrapper[4775]: I1216 14:58:49.041257 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dm57c" Dec 16 14:58:49 crc kubenswrapper[4775]: I1216 14:58:49.078447 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhnlc" event={"ID":"d5046921-0655-4cef-b310-018ed7ea22c4","Type":"ContainerStarted","Data":"e6c1f9c2e746ede5cb2bbd373c0b0dd407da6e3c8e55ba8c14cb6471f07c4203"} Dec 16 14:58:49 crc kubenswrapper[4775]: I1216 14:58:49.102137 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rhnlc" podStartSLOduration=2.842267964 podStartE2EDuration="1m29.102116244s" podCreationTimestamp="2025-12-16 14:57:20 +0000 UTC" firstStartedPulling="2025-12-16 14:57:22.005802274 +0000 UTC 
m=+166.956881197" lastFinishedPulling="2025-12-16 14:58:48.265650554 +0000 UTC m=+253.216729477" observedRunningTime="2025-12-16 14:58:49.100924375 +0000 UTC m=+254.052003318" watchObservedRunningTime="2025-12-16 14:58:49.102116244 +0000 UTC m=+254.053195167" Dec 16 14:58:49 crc kubenswrapper[4775]: I1216 14:58:49.121832 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dh2bb" Dec 16 14:58:49 crc kubenswrapper[4775]: I1216 14:58:49.125616 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dm57c" Dec 16 14:58:50 crc kubenswrapper[4775]: I1216 14:58:50.097469 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrkz2" event={"ID":"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2","Type":"ContainerStarted","Data":"4518f90a71984f29288b363af631a11d28801d3d91a5708e28b0e2653acef0e0"} Dec 16 14:58:50 crc kubenswrapper[4775]: I1216 14:58:50.624316 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4vbnc" Dec 16 14:58:50 crc kubenswrapper[4775]: I1216 14:58:50.625345 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4vbnc" Dec 16 14:58:50 crc kubenswrapper[4775]: I1216 14:58:50.682733 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4vbnc" Dec 16 14:58:51 crc kubenswrapper[4775]: I1216 14:58:51.023595 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dm57c"] Dec 16 14:58:51 crc kubenswrapper[4775]: I1216 14:58:51.102172 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dm57c" podUID="7c243fb1-03dc-4ed6-9fa6-418e871d8b5a" containerName="registry-server" 
containerID="cri-o://cc7b3b1772643d1a152c3a976c9839c01fdc8880ebab651ae4ae843a85fc5169" gracePeriod=2 Dec 16 14:58:51 crc kubenswrapper[4775]: I1216 14:58:51.120130 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rhnlc" Dec 16 14:58:51 crc kubenswrapper[4775]: I1216 14:58:51.120173 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rhnlc" Dec 16 14:58:51 crc kubenswrapper[4775]: I1216 14:58:51.121327 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wrkz2" podStartSLOduration=4.83927271 podStartE2EDuration="1m30.121287783s" podCreationTimestamp="2025-12-16 14:57:21 +0000 UTC" firstStartedPulling="2025-12-16 14:57:24.262135263 +0000 UTC m=+169.213214186" lastFinishedPulling="2025-12-16 14:58:49.544150346 +0000 UTC m=+254.495229259" observedRunningTime="2025-12-16 14:58:51.119451943 +0000 UTC m=+256.070530876" watchObservedRunningTime="2025-12-16 14:58:51.121287783 +0000 UTC m=+256.072366706" Dec 16 14:58:51 crc kubenswrapper[4775]: I1216 14:58:51.147769 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4vbnc" Dec 16 14:58:51 crc kubenswrapper[4775]: I1216 14:58:51.168272 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rhnlc" Dec 16 14:58:51 crc kubenswrapper[4775]: I1216 14:58:51.680826 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wrkz2" Dec 16 14:58:51 crc kubenswrapper[4775]: I1216 14:58:51.680876 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wrkz2" Dec 16 14:58:52 crc kubenswrapper[4775]: I1216 14:58:52.718635 4775 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-wrkz2" podUID="f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" containerName="registry-server" probeResult="failure" output=<
Dec 16 14:58:52 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s
Dec 16 14:58:52 crc kubenswrapper[4775]: >
Dec 16 14:58:53 crc kubenswrapper[4775]: I1216 14:58:53.117473 4775 generic.go:334] "Generic (PLEG): container finished" podID="7c243fb1-03dc-4ed6-9fa6-418e871d8b5a" containerID="cc7b3b1772643d1a152c3a976c9839c01fdc8880ebab651ae4ae843a85fc5169" exitCode=0
Dec 16 14:58:53 crc kubenswrapper[4775]: I1216 14:58:53.117546 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm57c" event={"ID":"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a","Type":"ContainerDied","Data":"cc7b3b1772643d1a152c3a976c9839c01fdc8880ebab651ae4ae843a85fc5169"}
Dec 16 14:58:53 crc kubenswrapper[4775]: I1216 14:58:53.533704 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dm57c"
Dec 16 14:58:53 crc kubenswrapper[4775]: I1216 14:58:53.661038 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-catalog-content\") pod \"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a\" (UID: \"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a\") "
Dec 16 14:58:53 crc kubenswrapper[4775]: I1216 14:58:53.661168 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-utilities\") pod \"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a\" (UID: \"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a\") "
Dec 16 14:58:53 crc kubenswrapper[4775]: I1216 14:58:53.661227 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfg7j\" (UniqueName: \"kubernetes.io/projected/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-kube-api-access-bfg7j\") pod \"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a\" (UID: \"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a\") "
Dec 16 14:58:53 crc kubenswrapper[4775]: I1216 14:58:53.662163 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-utilities" (OuterVolumeSpecName: "utilities") pod "7c243fb1-03dc-4ed6-9fa6-418e871d8b5a" (UID: "7c243fb1-03dc-4ed6-9fa6-418e871d8b5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 14:58:53 crc kubenswrapper[4775]: I1216 14:58:53.667463 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-kube-api-access-bfg7j" (OuterVolumeSpecName: "kube-api-access-bfg7j") pod "7c243fb1-03dc-4ed6-9fa6-418e871d8b5a" (UID: "7c243fb1-03dc-4ed6-9fa6-418e871d8b5a"). InnerVolumeSpecName "kube-api-access-bfg7j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 14:58:53 crc kubenswrapper[4775]: I1216 14:58:53.713487 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c243fb1-03dc-4ed6-9fa6-418e871d8b5a" (UID: "7c243fb1-03dc-4ed6-9fa6-418e871d8b5a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 14:58:53 crc kubenswrapper[4775]: I1216 14:58:53.763498 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 14:58:53 crc kubenswrapper[4775]: I1216 14:58:53.763560 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 14:58:53 crc kubenswrapper[4775]: I1216 14:58:53.763574 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfg7j\" (UniqueName: \"kubernetes.io/projected/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a-kube-api-access-bfg7j\") on node \"crc\" DevicePath \"\""
Dec 16 14:58:54 crc kubenswrapper[4775]: I1216 14:58:54.126167 4775 generic.go:334] "Generic (PLEG): container finished" podID="ef64597f-59f1-47be-afc6-aa95fb3c355c" containerID="ce1cc3a5386474f059988a815e4b5b16ec62d935de75bda749ca6188a5f3aded" exitCode=0
Dec 16 14:58:54 crc kubenswrapper[4775]: I1216 14:58:54.126252 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l8g4" event={"ID":"ef64597f-59f1-47be-afc6-aa95fb3c355c","Type":"ContainerDied","Data":"ce1cc3a5386474f059988a815e4b5b16ec62d935de75bda749ca6188a5f3aded"}
Dec 16 14:58:54 crc kubenswrapper[4775]: I1216 14:58:54.129122 4775 generic.go:334] "Generic (PLEG): container finished" podID="ce7e6431-8250-485f-a202-a781b4b719cb" containerID="6b8c283f86b6b410fa53b20a535f16ff16d5d5ae8a2cd0d18217109236c682bf" exitCode=0
Dec 16 14:58:54 crc kubenswrapper[4775]: I1216 14:58:54.129196 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhm9d" event={"ID":"ce7e6431-8250-485f-a202-a781b4b719cb","Type":"ContainerDied","Data":"6b8c283f86b6b410fa53b20a535f16ff16d5d5ae8a2cd0d18217109236c682bf"}
Dec 16 14:58:54 crc kubenswrapper[4775]: I1216 14:58:54.133294 4775 generic.go:334] "Generic (PLEG): container finished" podID="a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c" containerID="a39743942406deaeb967e7f7488d718ae6a509d0a63cfb0e2e88ebc313e4fe1e" exitCode=0
Dec 16 14:58:54 crc kubenswrapper[4775]: I1216 14:58:54.133351 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmqlw" event={"ID":"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c","Type":"ContainerDied","Data":"a39743942406deaeb967e7f7488d718ae6a509d0a63cfb0e2e88ebc313e4fe1e"}
Dec 16 14:58:54 crc kubenswrapper[4775]: I1216 14:58:54.138705 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dm57c" event={"ID":"7c243fb1-03dc-4ed6-9fa6-418e871d8b5a","Type":"ContainerDied","Data":"c7f534594cc07ac71b01dc11d5471c73e62b89304ef88e6771db91777992e1d0"}
Dec 16 14:58:54 crc kubenswrapper[4775]: I1216 14:58:54.138743 4775 scope.go:117] "RemoveContainer" containerID="cc7b3b1772643d1a152c3a976c9839c01fdc8880ebab651ae4ae843a85fc5169"
Dec 16 14:58:54 crc kubenswrapper[4775]: I1216 14:58:54.138810 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dm57c"
Dec 16 14:58:54 crc kubenswrapper[4775]: I1216 14:58:54.169312 4775 scope.go:117] "RemoveContainer" containerID="d741ffed290d091cdb6c50d9f7dcd2fc8e881279fd369afe3fac4578190d2430"
Dec 16 14:58:54 crc kubenswrapper[4775]: I1216 14:58:54.197746 4775 scope.go:117] "RemoveContainer" containerID="b92742153839286dee554116f17e643181486890f70533950be08edf99d3afe4"
Dec 16 14:58:54 crc kubenswrapper[4775]: I1216 14:58:54.239205 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dm57c"]
Dec 16 14:58:54 crc kubenswrapper[4775]: I1216 14:58:54.245199 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dm57c"]
Dec 16 14:58:55 crc kubenswrapper[4775]: I1216 14:58:55.147149 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l8g4" event={"ID":"ef64597f-59f1-47be-afc6-aa95fb3c355c","Type":"ContainerStarted","Data":"19b545bbd15e5da7e11ea2d5c1514c1735657be1aa7b1402052651578aff3cb8"}
Dec 16 14:58:55 crc kubenswrapper[4775]: I1216 14:58:55.150380 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhm9d" event={"ID":"ce7e6431-8250-485f-a202-a781b4b719cb","Type":"ContainerStarted","Data":"7ab0943cc7e8d24c7588b3de6542fa7e3fe5f3de60c1b41529d11a00adbecbe6"}
Dec 16 14:58:55 crc kubenswrapper[4775]: I1216 14:58:55.152736 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmqlw" event={"ID":"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c","Type":"ContainerStarted","Data":"f6c5854e99202fa132ca1e67f439fb8522a72c38f549490765ab31b6d625c1e7"}
Dec 16 14:58:55 crc kubenswrapper[4775]: I1216 14:58:55.168455 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8l8g4" podStartSLOduration=3.490236119 podStartE2EDuration="1m37.168432617s" podCreationTimestamp="2025-12-16 14:57:18 +0000 UTC" firstStartedPulling="2025-12-16 14:57:20.982180722 +0000 UTC m=+165.933259645" lastFinishedPulling="2025-12-16 14:58:54.66037722 +0000 UTC m=+259.611456143" observedRunningTime="2025-12-16 14:58:55.168263191 +0000 UTC m=+260.119342114" watchObservedRunningTime="2025-12-16 14:58:55.168432617 +0000 UTC m=+260.119511540"
Dec 16 14:58:55 crc kubenswrapper[4775]: I1216 14:58:55.185512 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xmqlw" podStartSLOduration=3.752444543 podStartE2EDuration="1m34.185489473s" podCreationTimestamp="2025-12-16 14:57:21 +0000 UTC" firstStartedPulling="2025-12-16 14:57:24.290283945 +0000 UTC m=+169.241362868" lastFinishedPulling="2025-12-16 14:58:54.723328875 +0000 UTC m=+259.674407798" observedRunningTime="2025-12-16 14:58:55.182353291 +0000 UTC m=+260.133432224" watchObservedRunningTime="2025-12-16 14:58:55.185489473 +0000 UTC m=+260.136568396"
Dec 16 14:58:55 crc kubenswrapper[4775]: I1216 14:58:55.201766 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hhm9d" podStartSLOduration=3.594790336 podStartE2EDuration="1m37.201744573s" podCreationTimestamp="2025-12-16 14:57:18 +0000 UTC" firstStartedPulling="2025-12-16 14:57:20.961252806 +0000 UTC m=+165.912331739" lastFinishedPulling="2025-12-16 14:58:54.568207053 +0000 UTC m=+259.519285976" observedRunningTime="2025-12-16 14:58:55.199996267 +0000 UTC m=+260.151075210" watchObservedRunningTime="2025-12-16 14:58:55.201744573 +0000 UTC m=+260.152823496"
Dec 16 14:58:55 crc kubenswrapper[4775]: I1216 14:58:55.344576 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c243fb1-03dc-4ed6-9fa6-418e871d8b5a" path="/var/lib/kubelet/pods/7c243fb1-03dc-4ed6-9fa6-418e871d8b5a/volumes"
Dec 16 14:58:55 crc kubenswrapper[4775]: I1216 14:58:55.720484 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78c5984846-crs58"]
Dec 16 14:58:55 crc kubenswrapper[4775]: I1216 14:58:55.720740 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-78c5984846-crs58" podUID="a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd" containerName="controller-manager" containerID="cri-o://9bd19c8730a64f6306eeac844833bf1699005876788a0a5db8867f28181005d1" gracePeriod=30
Dec 16 14:58:55 crc kubenswrapper[4775]: I1216 14:58:55.807209 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb"]
Dec 16 14:58:55 crc kubenswrapper[4775]: I1216 14:58:55.807491 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" podUID="0e534235-ae6b-4a4e-9bb6-0d53d33f69bc" containerName="route-controller-manager" containerID="cri-o://b724adc6bfd4bb154cdf2112522aecd04afcf1f8b48e34d20df417ea23f22a25" gracePeriod=30
Dec 16 14:58:56 crc kubenswrapper[4775]: I1216 14:58:56.160521 4775 generic.go:334] "Generic (PLEG): container finished" podID="0e534235-ae6b-4a4e-9bb6-0d53d33f69bc" containerID="b724adc6bfd4bb154cdf2112522aecd04afcf1f8b48e34d20df417ea23f22a25" exitCode=0
Dec 16 14:58:56 crc kubenswrapper[4775]: I1216 14:58:56.160608 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" event={"ID":"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc","Type":"ContainerDied","Data":"b724adc6bfd4bb154cdf2112522aecd04afcf1f8b48e34d20df417ea23f22a25"}
Dec 16 14:58:56 crc kubenswrapper[4775]: I1216 14:58:56.162664 4775 generic.go:334] "Generic (PLEG): container finished" podID="a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd" containerID="9bd19c8730a64f6306eeac844833bf1699005876788a0a5db8867f28181005d1" exitCode=0
Dec 16 14:58:56 crc kubenswrapper[4775]: I1216 14:58:56.162714 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c5984846-crs58" event={"ID":"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd","Type":"ContainerDied","Data":"9bd19c8730a64f6306eeac844833bf1699005876788a0a5db8867f28181005d1"}
Dec 16 14:58:56 crc kubenswrapper[4775]: I1216 14:58:56.812221 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb"
Dec 16 14:58:56 crc kubenswrapper[4775]: I1216 14:58:56.888024 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78c5984846-crs58"
Dec 16 14:58:56 crc kubenswrapper[4775]: I1216 14:58:56.911840 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-serving-cert\") pod \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\" (UID: \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\") "
Dec 16 14:58:56 crc kubenswrapper[4775]: I1216 14:58:56.911939 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j64cs\" (UniqueName: \"kubernetes.io/projected/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-kube-api-access-j64cs\") pod \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\" (UID: \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\") "
Dec 16 14:58:56 crc kubenswrapper[4775]: I1216 14:58:56.912158 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-config\") pod \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\" (UID: \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\") "
Dec 16 14:58:56 crc kubenswrapper[4775]: I1216 14:58:56.912192 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-client-ca\") pod \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\" (UID: \"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc\") "
Dec 16 14:58:56 crc kubenswrapper[4775]: I1216 14:58:56.913675 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-client-ca" (OuterVolumeSpecName: "client-ca") pod "0e534235-ae6b-4a4e-9bb6-0d53d33f69bc" (UID: "0e534235-ae6b-4a4e-9bb6-0d53d33f69bc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 14:58:56 crc kubenswrapper[4775]: I1216 14:58:56.914574 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-config" (OuterVolumeSpecName: "config") pod "0e534235-ae6b-4a4e-9bb6-0d53d33f69bc" (UID: "0e534235-ae6b-4a4e-9bb6-0d53d33f69bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 14:58:56 crc kubenswrapper[4775]: I1216 14:58:56.920454 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-kube-api-access-j64cs" (OuterVolumeSpecName: "kube-api-access-j64cs") pod "0e534235-ae6b-4a4e-9bb6-0d53d33f69bc" (UID: "0e534235-ae6b-4a4e-9bb6-0d53d33f69bc"). InnerVolumeSpecName "kube-api-access-j64cs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 14:58:56 crc kubenswrapper[4775]: I1216 14:58:56.928244 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0e534235-ae6b-4a4e-9bb6-0d53d33f69bc" (UID: "0e534235-ae6b-4a4e-9bb6-0d53d33f69bc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.012548 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"]
Dec 16 14:58:57 crc kubenswrapper[4775]: E1216 14:58:57.012798 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c243fb1-03dc-4ed6-9fa6-418e871d8b5a" containerName="registry-server"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.012813 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c243fb1-03dc-4ed6-9fa6-418e871d8b5a" containerName="registry-server"
Dec 16 14:58:57 crc kubenswrapper[4775]: E1216 14:58:57.012827 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13d12fa6-4dd1-4786-9a14-51284cbf2f6d" containerName="pruner"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.012833 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d12fa6-4dd1-4786-9a14-51284cbf2f6d" containerName="pruner"
Dec 16 14:58:57 crc kubenswrapper[4775]: E1216 14:58:57.012845 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd" containerName="controller-manager"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.012852 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd" containerName="controller-manager"
Dec 16 14:58:57 crc kubenswrapper[4775]: E1216 14:58:57.012867 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c243fb1-03dc-4ed6-9fa6-418e871d8b5a" containerName="extract-content"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.012872 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c243fb1-03dc-4ed6-9fa6-418e871d8b5a" containerName="extract-content"
Dec 16 14:58:57 crc kubenswrapper[4775]: E1216 14:58:57.012880 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c243fb1-03dc-4ed6-9fa6-418e871d8b5a" containerName="extract-utilities"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.012905 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c243fb1-03dc-4ed6-9fa6-418e871d8b5a" containerName="extract-utilities"
Dec 16 14:58:57 crc kubenswrapper[4775]: E1216 14:58:57.012924 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e534235-ae6b-4a4e-9bb6-0d53d33f69bc" containerName="route-controller-manager"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.012932 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e534235-ae6b-4a4e-9bb6-0d53d33f69bc" containerName="route-controller-manager"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.013029 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="13d12fa6-4dd1-4786-9a14-51284cbf2f6d" containerName="pruner"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.013042 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd" containerName="controller-manager"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.013057 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e534235-ae6b-4a4e-9bb6-0d53d33f69bc" containerName="route-controller-manager"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.013065 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c243fb1-03dc-4ed6-9fa6-418e871d8b5a" containerName="registry-server"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.013434 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.014325 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-proxy-ca-bundles\") pod \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") "
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.014371 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-client-ca\") pod \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") "
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.014397 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-serving-cert\") pod \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") "
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.014434 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-config\") pod \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") "
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.014535 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f64k8\" (UniqueName: \"kubernetes.io/projected/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-kube-api-access-f64k8\") pod \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\" (UID: \"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd\") "
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.014829 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-config\") on node \"crc\" DevicePath \"\""
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.014850 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-client-ca\") on node \"crc\" DevicePath \"\""
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.014863 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.014875 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j64cs\" (UniqueName: \"kubernetes.io/projected/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc-kube-api-access-j64cs\") on node \"crc\" DevicePath \"\""
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.017630 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd" (UID: "a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.017670 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-config" (OuterVolumeSpecName: "config") pod "a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd" (UID: "a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.020814 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd" (UID: "a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.020911 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-kube-api-access-f64k8" (OuterVolumeSpecName: "kube-api-access-f64k8") pod "a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd" (UID: "a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd"). InnerVolumeSpecName "kube-api-access-f64k8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.023153 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-client-ca" (OuterVolumeSpecName: "client-ca") pod "a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd" (UID: "a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.025128 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"]
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.116795 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61dbff96-b237-4949-99e1-774bcec682f4-client-ca\") pod \"route-controller-manager-775d89cb8f-976qk\" (UID: \"61dbff96-b237-4949-99e1-774bcec682f4\") " pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.116946 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61dbff96-b237-4949-99e1-774bcec682f4-serving-cert\") pod \"route-controller-manager-775d89cb8f-976qk\" (UID: \"61dbff96-b237-4949-99e1-774bcec682f4\") " pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.116989 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsj62\" (UniqueName: \"kubernetes.io/projected/61dbff96-b237-4949-99e1-774bcec682f4-kube-api-access-zsj62\") pod \"route-controller-manager-775d89cb8f-976qk\" (UID: \"61dbff96-b237-4949-99e1-774bcec682f4\") " pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.117036 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61dbff96-b237-4949-99e1-774bcec682f4-config\") pod \"route-controller-manager-775d89cb8f-976qk\" (UID: \"61dbff96-b237-4949-99e1-774bcec682f4\") " pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.117187 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f64k8\" (UniqueName: \"kubernetes.io/projected/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-kube-api-access-f64k8\") on node \"crc\" DevicePath \"\""
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.117213 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.117226 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-client-ca\") on node \"crc\" DevicePath \"\""
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.117239 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.117251 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd-config\") on node \"crc\" DevicePath \"\""
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.174860 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb" event={"ID":"0e534235-ae6b-4a4e-9bb6-0d53d33f69bc","Type":"ContainerDied","Data":"17ec2aeaf1e185bdcec67cbfd4725766d0e1a2edba6cdd4998f791fe6bc744d1"}
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.174956 4775 scope.go:117] "RemoveContainer" containerID="b724adc6bfd4bb154cdf2112522aecd04afcf1f8b48e34d20df417ea23f22a25"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.174958 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.179396 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c5984846-crs58" event={"ID":"a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd","Type":"ContainerDied","Data":"b94d789436880dcb38324c36709cb5693dab769812b8d11b208861be4095e098"}
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.179477 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78c5984846-crs58"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.202799 4775 scope.go:117] "RemoveContainer" containerID="9bd19c8730a64f6306eeac844833bf1699005876788a0a5db8867f28181005d1"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.220225 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61dbff96-b237-4949-99e1-774bcec682f4-serving-cert\") pod \"route-controller-manager-775d89cb8f-976qk\" (UID: \"61dbff96-b237-4949-99e1-774bcec682f4\") " pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.220295 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsj62\" (UniqueName: \"kubernetes.io/projected/61dbff96-b237-4949-99e1-774bcec682f4-kube-api-access-zsj62\") pod \"route-controller-manager-775d89cb8f-976qk\" (UID: \"61dbff96-b237-4949-99e1-774bcec682f4\") " pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.220370 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61dbff96-b237-4949-99e1-774bcec682f4-config\") pod \"route-controller-manager-775d89cb8f-976qk\" (UID: \"61dbff96-b237-4949-99e1-774bcec682f4\") " pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.220483 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61dbff96-b237-4949-99e1-774bcec682f4-client-ca\") pod \"route-controller-manager-775d89cb8f-976qk\" (UID: \"61dbff96-b237-4949-99e1-774bcec682f4\") " pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.221699 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61dbff96-b237-4949-99e1-774bcec682f4-client-ca\") pod \"route-controller-manager-775d89cb8f-976qk\" (UID: \"61dbff96-b237-4949-99e1-774bcec682f4\") " pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.222045 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61dbff96-b237-4949-99e1-774bcec682f4-config\") pod \"route-controller-manager-775d89cb8f-976qk\" (UID: \"61dbff96-b237-4949-99e1-774bcec682f4\") " pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.227307 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78c5984846-crs58"]
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.229128 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61dbff96-b237-4949-99e1-774bcec682f4-serving-cert\") pod \"route-controller-manager-775d89cb8f-976qk\" (UID: \"61dbff96-b237-4949-99e1-774bcec682f4\") " pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.234010 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-78c5984846-crs58"]
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.245353 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb"]
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.247113 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsj62\" (UniqueName: \"kubernetes.io/projected/61dbff96-b237-4949-99e1-774bcec682f4-kube-api-access-zsj62\") pod \"route-controller-manager-775d89cb8f-976qk\" (UID: \"61dbff96-b237-4949-99e1-774bcec682f4\") " pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.249371 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cc8fccf5d-jlmvb"]
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.348393 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e534235-ae6b-4a4e-9bb6-0d53d33f69bc" path="/var/lib/kubelet/pods/0e534235-ae6b-4a4e-9bb6-0d53d33f69bc/volumes"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.349380 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd" path="/var/lib/kubelet/pods/a8d6fb76-ab24-474f-9fdb-5c5893bd6bbd/volumes"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.358924 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"
Dec 16 14:58:57 crc kubenswrapper[4775]: I1216 14:58:57.805616 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"]
Dec 16 14:58:58 crc kubenswrapper[4775]: I1216 14:58:58.193086 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk" event={"ID":"61dbff96-b237-4949-99e1-774bcec682f4","Type":"ContainerStarted","Data":"92e524de76350cc104ceaf1c6a957f3f463b1cbf124ce7688a67da11d290c05b"}
Dec 16 14:58:58 crc kubenswrapper[4775]: I1216 14:58:58.916505 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8l8g4"
Dec 16 14:58:58 crc kubenswrapper[4775]: I1216 14:58:58.916930 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8l8g4"
Dec 16 14:58:58 crc kubenswrapper[4775]: I1216 14:58:58.980224 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8l8g4"
Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.015185 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-767c66895f-dfn2j"]
Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.016283 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j"
Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.021369 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.023845 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.025411 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.025531 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.025566 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.027664 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.030475 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-767c66895f-dfn2j"]
Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.032840 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.147876 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d009283a-9bbd-455c-b5c2-4ed6c3336b52-serving-cert\") pod \"controller-manager-767c66895f-dfn2j\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j"
Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.148019 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-client-ca\") pod \"controller-manager-767c66895f-dfn2j\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j"
Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.148050 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-proxy-ca-bundles\") pod \"controller-manager-767c66895f-dfn2j\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j"
Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.148087 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-config\") pod \"controller-manager-767c66895f-dfn2j\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j"
Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.148125 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4ftw\" (UniqueName: \"kubernetes.io/projected/d009283a-9bbd-455c-b5c2-4ed6c3336b52-kube-api-access-v4ftw\") pod \"controller-manager-767c66895f-dfn2j\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j"
Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.164646 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status=""
pod="openshift-marketplace/community-operators-hhm9d" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.164709 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hhm9d" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.208432 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk" event={"ID":"61dbff96-b237-4949-99e1-774bcec682f4","Type":"ContainerStarted","Data":"7eeb6757a6a2e10e762779d8e360b7b17a41e0a5fdf191620a2d7da708270806"} Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.208753 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.214117 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hhm9d" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.233415 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk" podStartSLOduration=4.233393407 podStartE2EDuration="4.233393407s" podCreationTimestamp="2025-12-16 14:58:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:58:59.227461537 +0000 UTC m=+264.178540480" watchObservedRunningTime="2025-12-16 14:58:59.233393407 +0000 UTC m=+264.184472330" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.249738 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d009283a-9bbd-455c-b5c2-4ed6c3336b52-serving-cert\") pod \"controller-manager-767c66895f-dfn2j\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " 
pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.249906 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-client-ca\") pod \"controller-manager-767c66895f-dfn2j\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.249930 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-proxy-ca-bundles\") pod \"controller-manager-767c66895f-dfn2j\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.249958 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-config\") pod \"controller-manager-767c66895f-dfn2j\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.249986 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4ftw\" (UniqueName: \"kubernetes.io/projected/d009283a-9bbd-455c-b5c2-4ed6c3336b52-kube-api-access-v4ftw\") pod \"controller-manager-767c66895f-dfn2j\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.252131 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-config\") pod 
\"controller-manager-767c66895f-dfn2j\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.252938 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-proxy-ca-bundles\") pod \"controller-manager-767c66895f-dfn2j\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.253271 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-client-ca\") pod \"controller-manager-767c66895f-dfn2j\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.259072 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d009283a-9bbd-455c-b5c2-4ed6c3336b52-serving-cert\") pod \"controller-manager-767c66895f-dfn2j\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.270261 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4ftw\" (UniqueName: \"kubernetes.io/projected/d009283a-9bbd-455c-b5c2-4ed6c3336b52-kube-api-access-v4ftw\") pod \"controller-manager-767c66895f-dfn2j\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.277592 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-8l8g4" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.287524 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hhm9d" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.334769 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.628859 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hhm9d"] Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.641804 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk" Dec 16 14:58:59 crc kubenswrapper[4775]: I1216 14:58:59.772335 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-767c66895f-dfn2j"] Dec 16 14:59:00 crc kubenswrapper[4775]: I1216 14:59:00.217036 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" event={"ID":"d009283a-9bbd-455c-b5c2-4ed6c3336b52","Type":"ContainerStarted","Data":"4847f85f823721dfe69bfd288f58c3206e85c9da19d07125de3aaaa8e9262c46"} Dec 16 14:59:01 crc kubenswrapper[4775]: I1216 14:59:01.168355 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rhnlc" Dec 16 14:59:01 crc kubenswrapper[4775]: I1216 14:59:01.224580 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hhm9d" podUID="ce7e6431-8250-485f-a202-a781b4b719cb" containerName="registry-server" containerID="cri-o://7ab0943cc7e8d24c7588b3de6542fa7e3fe5f3de60c1b41529d11a00adbecbe6" gracePeriod=2 Dec 16 14:59:01 crc kubenswrapper[4775]: I1216 
14:59:01.737212 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wrkz2" Dec 16 14:59:01 crc kubenswrapper[4775]: I1216 14:59:01.778881 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wrkz2" Dec 16 14:59:02 crc kubenswrapper[4775]: I1216 14:59:02.156651 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xmqlw" Dec 16 14:59:02 crc kubenswrapper[4775]: I1216 14:59:02.156731 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xmqlw" Dec 16 14:59:02 crc kubenswrapper[4775]: I1216 14:59:02.208291 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xmqlw" Dec 16 14:59:02 crc kubenswrapper[4775]: I1216 14:59:02.264764 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xmqlw" Dec 16 14:59:03 crc kubenswrapper[4775]: I1216 14:59:03.120784 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ms9lk"] Dec 16 14:59:03 crc kubenswrapper[4775]: I1216 14:59:03.236338 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" event={"ID":"d009283a-9bbd-455c-b5c2-4ed6c3336b52","Type":"ContainerStarted","Data":"71a5256bd60c43445f1d19512f8df0dd1cce666a9a3e4a4e6f5acaff757aad4b"} Dec 16 14:59:03 crc kubenswrapper[4775]: I1216 14:59:03.422171 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhnlc"] Dec 16 14:59:03 crc kubenswrapper[4775]: I1216 14:59:03.422465 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rhnlc" 
podUID="d5046921-0655-4cef-b310-018ed7ea22c4" containerName="registry-server" containerID="cri-o://e6c1f9c2e746ede5cb2bbd373c0b0dd407da6e3c8e55ba8c14cb6471f07c4203" gracePeriod=2 Dec 16 14:59:03 crc kubenswrapper[4775]: I1216 14:59:03.782262 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhnlc" Dec 16 14:59:03 crc kubenswrapper[4775]: I1216 14:59:03.916716 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5046921-0655-4cef-b310-018ed7ea22c4-catalog-content\") pod \"d5046921-0655-4cef-b310-018ed7ea22c4\" (UID: \"d5046921-0655-4cef-b310-018ed7ea22c4\") " Dec 16 14:59:03 crc kubenswrapper[4775]: I1216 14:59:03.916830 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q287q\" (UniqueName: \"kubernetes.io/projected/d5046921-0655-4cef-b310-018ed7ea22c4-kube-api-access-q287q\") pod \"d5046921-0655-4cef-b310-018ed7ea22c4\" (UID: \"d5046921-0655-4cef-b310-018ed7ea22c4\") " Dec 16 14:59:03 crc kubenswrapper[4775]: I1216 14:59:03.916850 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5046921-0655-4cef-b310-018ed7ea22c4-utilities\") pod \"d5046921-0655-4cef-b310-018ed7ea22c4\" (UID: \"d5046921-0655-4cef-b310-018ed7ea22c4\") " Dec 16 14:59:03 crc kubenswrapper[4775]: I1216 14:59:03.917658 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5046921-0655-4cef-b310-018ed7ea22c4-utilities" (OuterVolumeSpecName: "utilities") pod "d5046921-0655-4cef-b310-018ed7ea22c4" (UID: "d5046921-0655-4cef-b310-018ed7ea22c4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:59:03 crc kubenswrapper[4775]: I1216 14:59:03.922548 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5046921-0655-4cef-b310-018ed7ea22c4-kube-api-access-q287q" (OuterVolumeSpecName: "kube-api-access-q287q") pod "d5046921-0655-4cef-b310-018ed7ea22c4" (UID: "d5046921-0655-4cef-b310-018ed7ea22c4"). InnerVolumeSpecName "kube-api-access-q287q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:59:03 crc kubenswrapper[4775]: I1216 14:59:03.938097 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5046921-0655-4cef-b310-018ed7ea22c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5046921-0655-4cef-b310-018ed7ea22c4" (UID: "d5046921-0655-4cef-b310-018ed7ea22c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:59:03 crc kubenswrapper[4775]: I1216 14:59:03.982480 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hhm9d" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.020258 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4w5x\" (UniqueName: \"kubernetes.io/projected/ce7e6431-8250-485f-a202-a781b4b719cb-kube-api-access-t4w5x\") pod \"ce7e6431-8250-485f-a202-a781b4b719cb\" (UID: \"ce7e6431-8250-485f-a202-a781b4b719cb\") " Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.020403 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce7e6431-8250-485f-a202-a781b4b719cb-catalog-content\") pod \"ce7e6431-8250-485f-a202-a781b4b719cb\" (UID: \"ce7e6431-8250-485f-a202-a781b4b719cb\") " Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.020489 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce7e6431-8250-485f-a202-a781b4b719cb-utilities\") pod \"ce7e6431-8250-485f-a202-a781b4b719cb\" (UID: \"ce7e6431-8250-485f-a202-a781b4b719cb\") " Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.021145 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5046921-0655-4cef-b310-018ed7ea22c4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.021174 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q287q\" (UniqueName: \"kubernetes.io/projected/d5046921-0655-4cef-b310-018ed7ea22c4-kube-api-access-q287q\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.021196 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5046921-0655-4cef-b310-018ed7ea22c4-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:04 crc kubenswrapper[4775]: 
I1216 14:59:04.025043 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce7e6431-8250-485f-a202-a781b4b719cb-utilities" (OuterVolumeSpecName: "utilities") pod "ce7e6431-8250-485f-a202-a781b4b719cb" (UID: "ce7e6431-8250-485f-a202-a781b4b719cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.035139 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce7e6431-8250-485f-a202-a781b4b719cb-kube-api-access-t4w5x" (OuterVolumeSpecName: "kube-api-access-t4w5x") pod "ce7e6431-8250-485f-a202-a781b4b719cb" (UID: "ce7e6431-8250-485f-a202-a781b4b719cb"). InnerVolumeSpecName "kube-api-access-t4w5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.085518 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce7e6431-8250-485f-a202-a781b4b719cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce7e6431-8250-485f-a202-a781b4b719cb" (UID: "ce7e6431-8250-485f-a202-a781b4b719cb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.123995 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce7e6431-8250-485f-a202-a781b4b719cb-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.124065 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4w5x\" (UniqueName: \"kubernetes.io/projected/ce7e6431-8250-485f-a202-a781b4b719cb-kube-api-access-t4w5x\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.124082 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce7e6431-8250-485f-a202-a781b4b719cb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.244086 4775 generic.go:334] "Generic (PLEG): container finished" podID="ce7e6431-8250-485f-a202-a781b4b719cb" containerID="7ab0943cc7e8d24c7588b3de6542fa7e3fe5f3de60c1b41529d11a00adbecbe6" exitCode=0 Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.244149 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhm9d" event={"ID":"ce7e6431-8250-485f-a202-a781b4b719cb","Type":"ContainerDied","Data":"7ab0943cc7e8d24c7588b3de6542fa7e3fe5f3de60c1b41529d11a00adbecbe6"} Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.244531 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhm9d" event={"ID":"ce7e6431-8250-485f-a202-a781b4b719cb","Type":"ContainerDied","Data":"ab2c4a6c19f4aeb32307e11321422c79b48aa0ead871bb8e849e2442fb4b3aa8"} Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.244552 4775 scope.go:117] "RemoveContainer" containerID="7ab0943cc7e8d24c7588b3de6542fa7e3fe5f3de60c1b41529d11a00adbecbe6" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 
14:59:04.244197 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hhm9d" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.249692 4775 generic.go:334] "Generic (PLEG): container finished" podID="d5046921-0655-4cef-b310-018ed7ea22c4" containerID="e6c1f9c2e746ede5cb2bbd373c0b0dd407da6e3c8e55ba8c14cb6471f07c4203" exitCode=0 Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.249726 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhnlc" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.249739 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhnlc" event={"ID":"d5046921-0655-4cef-b310-018ed7ea22c4","Type":"ContainerDied","Data":"e6c1f9c2e746ede5cb2bbd373c0b0dd407da6e3c8e55ba8c14cb6471f07c4203"} Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.250016 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhnlc" event={"ID":"d5046921-0655-4cef-b310-018ed7ea22c4","Type":"ContainerDied","Data":"8d03d66851101c7e1d4291882e5498e4c3bf60ac28b176277e427b376df4a0ff"} Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.250468 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.260228 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.276324 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" podStartSLOduration=9.276305263 podStartE2EDuration="9.276305263s" podCreationTimestamp="2025-12-16 14:58:55 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:04.274187385 +0000 UTC m=+269.225266328" watchObservedRunningTime="2025-12-16 14:59:04.276305263 +0000 UTC m=+269.227384186" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.283283 4775 scope.go:117] "RemoveContainer" containerID="6b8c283f86b6b410fa53b20a535f16ff16d5d5ae8a2cd0d18217109236c682bf" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.328248 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.328295 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.329454 4775 scope.go:117] "RemoveContainer" containerID="856fbd9b77671619e2fe79f71ff0f8d3883f9edf6d1a0bd11b1af97c5a7a900b" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.331089 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.331205 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhnlc"] Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.331357 4775 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-network-console"/"networking-console-plugin" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.337864 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhnlc"] Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.344450 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.347533 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.348645 4775 scope.go:117] "RemoveContainer" containerID="7ab0943cc7e8d24c7588b3de6542fa7e3fe5f3de60c1b41529d11a00adbecbe6" Dec 16 14:59:04 crc kubenswrapper[4775]: E1216 14:59:04.349356 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ab0943cc7e8d24c7588b3de6542fa7e3fe5f3de60c1b41529d11a00adbecbe6\": container with ID starting with 7ab0943cc7e8d24c7588b3de6542fa7e3fe5f3de60c1b41529d11a00adbecbe6 not found: ID does not exist" containerID="7ab0943cc7e8d24c7588b3de6542fa7e3fe5f3de60c1b41529d11a00adbecbe6" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.349496 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ab0943cc7e8d24c7588b3de6542fa7e3fe5f3de60c1b41529d11a00adbecbe6"} err="failed 
to get container status \"7ab0943cc7e8d24c7588b3de6542fa7e3fe5f3de60c1b41529d11a00adbecbe6\": rpc error: code = NotFound desc = could not find container \"7ab0943cc7e8d24c7588b3de6542fa7e3fe5f3de60c1b41529d11a00adbecbe6\": container with ID starting with 7ab0943cc7e8d24c7588b3de6542fa7e3fe5f3de60c1b41529d11a00adbecbe6 not found: ID does not exist" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.349622 4775 scope.go:117] "RemoveContainer" containerID="6b8c283f86b6b410fa53b20a535f16ff16d5d5ae8a2cd0d18217109236c682bf" Dec 16 14:59:04 crc kubenswrapper[4775]: E1216 14:59:04.350074 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b8c283f86b6b410fa53b20a535f16ff16d5d5ae8a2cd0d18217109236c682bf\": container with ID starting with 6b8c283f86b6b410fa53b20a535f16ff16d5d5ae8a2cd0d18217109236c682bf not found: ID does not exist" containerID="6b8c283f86b6b410fa53b20a535f16ff16d5d5ae8a2cd0d18217109236c682bf" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.350185 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b8c283f86b6b410fa53b20a535f16ff16d5d5ae8a2cd0d18217109236c682bf"} err="failed to get container status \"6b8c283f86b6b410fa53b20a535f16ff16d5d5ae8a2cd0d18217109236c682bf\": rpc error: code = NotFound desc = could not find container \"6b8c283f86b6b410fa53b20a535f16ff16d5d5ae8a2cd0d18217109236c682bf\": container with ID starting with 6b8c283f86b6b410fa53b20a535f16ff16d5d5ae8a2cd0d18217109236c682bf not found: ID does not exist" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.350291 4775 scope.go:117] "RemoveContainer" containerID="856fbd9b77671619e2fe79f71ff0f8d3883f9edf6d1a0bd11b1af97c5a7a900b" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.350222 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hhm9d"] Dec 16 14:59:04 crc kubenswrapper[4775]: E1216 
14:59:04.350736 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"856fbd9b77671619e2fe79f71ff0f8d3883f9edf6d1a0bd11b1af97c5a7a900b\": container with ID starting with 856fbd9b77671619e2fe79f71ff0f8d3883f9edf6d1a0bd11b1af97c5a7a900b not found: ID does not exist" containerID="856fbd9b77671619e2fe79f71ff0f8d3883f9edf6d1a0bd11b1af97c5a7a900b" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.350839 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"856fbd9b77671619e2fe79f71ff0f8d3883f9edf6d1a0bd11b1af97c5a7a900b"} err="failed to get container status \"856fbd9b77671619e2fe79f71ff0f8d3883f9edf6d1a0bd11b1af97c5a7a900b\": rpc error: code = NotFound desc = could not find container \"856fbd9b77671619e2fe79f71ff0f8d3883f9edf6d1a0bd11b1af97c5a7a900b\": container with ID starting with 856fbd9b77671619e2fe79f71ff0f8d3883f9edf6d1a0bd11b1af97c5a7a900b not found: ID does not exist" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.350945 4775 scope.go:117] "RemoveContainer" containerID="e6c1f9c2e746ede5cb2bbd373c0b0dd407da6e3c8e55ba8c14cb6471f07c4203" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.352502 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.357008 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hhm9d"] Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.398014 4775 scope.go:117] "RemoveContainer" containerID="d918aa1984ca6e854a58b04cabc054886a8c4244a4e5bc5e9cb953423d39de4d" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.413016 4775 scope.go:117] "RemoveContainer" containerID="6bfa6100877012029ae4814c41e16486361c525d2e6093f5e13570459a918943" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.429991 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.430094 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.432037 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.435776 4775 scope.go:117] "RemoveContainer" containerID="e6c1f9c2e746ede5cb2bbd373c0b0dd407da6e3c8e55ba8c14cb6471f07c4203" Dec 16 14:59:04 crc kubenswrapper[4775]: E1216 14:59:04.436967 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"e6c1f9c2e746ede5cb2bbd373c0b0dd407da6e3c8e55ba8c14cb6471f07c4203\": container with ID starting with e6c1f9c2e746ede5cb2bbd373c0b0dd407da6e3c8e55ba8c14cb6471f07c4203 not found: ID does not exist" containerID="e6c1f9c2e746ede5cb2bbd373c0b0dd407da6e3c8e55ba8c14cb6471f07c4203" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.436993 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6c1f9c2e746ede5cb2bbd373c0b0dd407da6e3c8e55ba8c14cb6471f07c4203"} err="failed to get container status \"e6c1f9c2e746ede5cb2bbd373c0b0dd407da6e3c8e55ba8c14cb6471f07c4203\": rpc error: code = NotFound desc = could not find container \"e6c1f9c2e746ede5cb2bbd373c0b0dd407da6e3c8e55ba8c14cb6471f07c4203\": container with ID starting with e6c1f9c2e746ede5cb2bbd373c0b0dd407da6e3c8e55ba8c14cb6471f07c4203 not found: ID does not exist" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.437021 4775 scope.go:117] "RemoveContainer" containerID="d918aa1984ca6e854a58b04cabc054886a8c4244a4e5bc5e9cb953423d39de4d" Dec 16 14:59:04 crc kubenswrapper[4775]: E1216 14:59:04.437483 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d918aa1984ca6e854a58b04cabc054886a8c4244a4e5bc5e9cb953423d39de4d\": container with ID starting with d918aa1984ca6e854a58b04cabc054886a8c4244a4e5bc5e9cb953423d39de4d not found: ID does not exist" containerID="d918aa1984ca6e854a58b04cabc054886a8c4244a4e5bc5e9cb953423d39de4d" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.437548 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d918aa1984ca6e854a58b04cabc054886a8c4244a4e5bc5e9cb953423d39de4d"} err="failed to get container status \"d918aa1984ca6e854a58b04cabc054886a8c4244a4e5bc5e9cb953423d39de4d\": rpc error: code = NotFound desc = could not find container 
\"d918aa1984ca6e854a58b04cabc054886a8c4244a4e5bc5e9cb953423d39de4d\": container with ID starting with d918aa1984ca6e854a58b04cabc054886a8c4244a4e5bc5e9cb953423d39de4d not found: ID does not exist" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.437571 4775 scope.go:117] "RemoveContainer" containerID="6bfa6100877012029ae4814c41e16486361c525d2e6093f5e13570459a918943" Dec 16 14:59:04 crc kubenswrapper[4775]: E1216 14:59:04.437860 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bfa6100877012029ae4814c41e16486361c525d2e6093f5e13570459a918943\": container with ID starting with 6bfa6100877012029ae4814c41e16486361c525d2e6093f5e13570459a918943 not found: ID does not exist" containerID="6bfa6100877012029ae4814c41e16486361c525d2e6093f5e13570459a918943" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.437879 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bfa6100877012029ae4814c41e16486361c525d2e6093f5e13570459a918943"} err="failed to get container status \"6bfa6100877012029ae4814c41e16486361c525d2e6093f5e13570459a918943\": rpc error: code = NotFound desc = could not find container \"6bfa6100877012029ae4814c41e16486361c525d2e6093f5e13570459a918943\": container with ID starting with 6bfa6100877012029ae4814c41e16486361c525d2e6093f5e13570459a918943 not found: ID does not exist" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.443317 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.457941 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.458387 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.560194 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:59:04 crc kubenswrapper[4775]: I1216 14:59:04.571561 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:59:04 crc kubenswrapper[4775]: W1216 14:59:04.761057 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-9328e98c7c84ed9443f4a2f7907674091ef229c2da71423fbf92892b231a9d85 WatchSource:0}: Error finding container 9328e98c7c84ed9443f4a2f7907674091ef229c2da71423fbf92892b231a9d85: Status 404 returned error can't find the container with id 9328e98c7c84ed9443f4a2f7907674091ef229c2da71423fbf92892b231a9d85 Dec 16 14:59:04 crc kubenswrapper[4775]: W1216 14:59:04.981359 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-4e8c61064ffe222e929c80b6cd7bd9f2d21eaa397d07f4cda814eb65319cf66b WatchSource:0}: Error finding container 4e8c61064ffe222e929c80b6cd7bd9f2d21eaa397d07f4cda814eb65319cf66b: Status 404 returned error can't find the container with id 4e8c61064ffe222e929c80b6cd7bd9f2d21eaa397d07f4cda814eb65319cf66b Dec 16 14:59:05 crc 
kubenswrapper[4775]: W1216 14:59:05.037112 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-102cf34d8250d04b58cfc97c3ffb1bdf0a67f84a16a924d458680f95ab9a2f6e WatchSource:0}: Error finding container 102cf34d8250d04b58cfc97c3ffb1bdf0a67f84a16a924d458680f95ab9a2f6e: Status 404 returned error can't find the container with id 102cf34d8250d04b58cfc97c3ffb1bdf0a67f84a16a924d458680f95ab9a2f6e Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.229094 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xmqlw"] Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.229420 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xmqlw" podUID="a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c" containerName="registry-server" containerID="cri-o://f6c5854e99202fa132ca1e67f439fb8522a72c38f549490765ab31b6d625c1e7" gracePeriod=2 Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.258624 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"78ec8846668126913fe1d9b86ca32f01e3ed98ddd4b3a76eb01c2561b32a1122"} Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.258677 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9328e98c7c84ed9443f4a2f7907674091ef229c2da71423fbf92892b231a9d85"} Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.260671 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"41245ab1aaf31e211a166568d542e4e5579090a4d481a1b73989a93168a4d748"} Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.260707 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"102cf34d8250d04b58cfc97c3ffb1bdf0a67f84a16a924d458680f95ab9a2f6e"} Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.263054 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"37e933261e632da6a1ebdf15b0f85fdb6882450110f108e7a7e16461913921f2"} Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.263080 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4e8c61064ffe222e929c80b6cd7bd9f2d21eaa397d07f4cda814eb65319cf66b"} Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.263602 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.352725 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce7e6431-8250-485f-a202-a781b4b719cb" path="/var/lib/kubelet/pods/ce7e6431-8250-485f-a202-a781b4b719cb/volumes" Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.353705 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5046921-0655-4cef-b310-018ed7ea22c4" path="/var/lib/kubelet/pods/d5046921-0655-4cef-b310-018ed7ea22c4/volumes" Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.682348 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xmqlw" Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.773174 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-utilities\") pod \"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c\" (UID: \"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c\") " Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.773347 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-catalog-content\") pod \"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c\" (UID: \"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c\") " Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.773474 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjfff\" (UniqueName: \"kubernetes.io/projected/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-kube-api-access-bjfff\") pod \"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c\" (UID: \"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c\") " Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.774270 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-utilities" (OuterVolumeSpecName: "utilities") pod "a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c" (UID: "a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.781145 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-kube-api-access-bjfff" (OuterVolumeSpecName: "kube-api-access-bjfff") pod "a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c" (UID: "a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c"). InnerVolumeSpecName "kube-api-access-bjfff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.874674 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjfff\" (UniqueName: \"kubernetes.io/projected/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-kube-api-access-bjfff\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.875056 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.926173 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c" (UID: "a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:59:05 crc kubenswrapper[4775]: I1216 14:59:05.976499 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:06 crc kubenswrapper[4775]: I1216 14:59:06.283782 4775 generic.go:334] "Generic (PLEG): container finished" podID="a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c" containerID="f6c5854e99202fa132ca1e67f439fb8522a72c38f549490765ab31b6d625c1e7" exitCode=0 Dec 16 14:59:06 crc kubenswrapper[4775]: I1216 14:59:06.283875 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmqlw" event={"ID":"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c","Type":"ContainerDied","Data":"f6c5854e99202fa132ca1e67f439fb8522a72c38f549490765ab31b6d625c1e7"} Dec 16 14:59:06 crc kubenswrapper[4775]: I1216 14:59:06.283875 4775 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xmqlw" Dec 16 14:59:06 crc kubenswrapper[4775]: I1216 14:59:06.284203 4775 scope.go:117] "RemoveContainer" containerID="f6c5854e99202fa132ca1e67f439fb8522a72c38f549490765ab31b6d625c1e7" Dec 16 14:59:06 crc kubenswrapper[4775]: I1216 14:59:06.284178 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmqlw" event={"ID":"a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c","Type":"ContainerDied","Data":"d62251ce7352026a2de91e2ca25f72131d43f698370d18e84c66aedc49dfc5f1"} Dec 16 14:59:06 crc kubenswrapper[4775]: I1216 14:59:06.302499 4775 scope.go:117] "RemoveContainer" containerID="a39743942406deaeb967e7f7488d718ae6a509d0a63cfb0e2e88ebc313e4fe1e" Dec 16 14:59:06 crc kubenswrapper[4775]: I1216 14:59:06.327687 4775 scope.go:117] "RemoveContainer" containerID="32c4b76138b7d1251a29272c2f8ab0ce4cba04bd48729d29e94a8338acf4f447" Dec 16 14:59:06 crc kubenswrapper[4775]: I1216 14:59:06.332275 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xmqlw"] Dec 16 14:59:06 crc kubenswrapper[4775]: I1216 14:59:06.336981 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xmqlw"] Dec 16 14:59:06 crc kubenswrapper[4775]: I1216 14:59:06.347239 4775 scope.go:117] "RemoveContainer" containerID="f6c5854e99202fa132ca1e67f439fb8522a72c38f549490765ab31b6d625c1e7" Dec 16 14:59:06 crc kubenswrapper[4775]: E1216 14:59:06.347739 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6c5854e99202fa132ca1e67f439fb8522a72c38f549490765ab31b6d625c1e7\": container with ID starting with f6c5854e99202fa132ca1e67f439fb8522a72c38f549490765ab31b6d625c1e7 not found: ID does not exist" containerID="f6c5854e99202fa132ca1e67f439fb8522a72c38f549490765ab31b6d625c1e7" Dec 16 14:59:06 crc kubenswrapper[4775]: I1216 14:59:06.347921 4775 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6c5854e99202fa132ca1e67f439fb8522a72c38f549490765ab31b6d625c1e7"} err="failed to get container status \"f6c5854e99202fa132ca1e67f439fb8522a72c38f549490765ab31b6d625c1e7\": rpc error: code = NotFound desc = could not find container \"f6c5854e99202fa132ca1e67f439fb8522a72c38f549490765ab31b6d625c1e7\": container with ID starting with f6c5854e99202fa132ca1e67f439fb8522a72c38f549490765ab31b6d625c1e7 not found: ID does not exist" Dec 16 14:59:06 crc kubenswrapper[4775]: I1216 14:59:06.348029 4775 scope.go:117] "RemoveContainer" containerID="a39743942406deaeb967e7f7488d718ae6a509d0a63cfb0e2e88ebc313e4fe1e" Dec 16 14:59:06 crc kubenswrapper[4775]: E1216 14:59:06.348427 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a39743942406deaeb967e7f7488d718ae6a509d0a63cfb0e2e88ebc313e4fe1e\": container with ID starting with a39743942406deaeb967e7f7488d718ae6a509d0a63cfb0e2e88ebc313e4fe1e not found: ID does not exist" containerID="a39743942406deaeb967e7f7488d718ae6a509d0a63cfb0e2e88ebc313e4fe1e" Dec 16 14:59:06 crc kubenswrapper[4775]: I1216 14:59:06.348498 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a39743942406deaeb967e7f7488d718ae6a509d0a63cfb0e2e88ebc313e4fe1e"} err="failed to get container status \"a39743942406deaeb967e7f7488d718ae6a509d0a63cfb0e2e88ebc313e4fe1e\": rpc error: code = NotFound desc = could not find container \"a39743942406deaeb967e7f7488d718ae6a509d0a63cfb0e2e88ebc313e4fe1e\": container with ID starting with a39743942406deaeb967e7f7488d718ae6a509d0a63cfb0e2e88ebc313e4fe1e not found: ID does not exist" Dec 16 14:59:06 crc kubenswrapper[4775]: I1216 14:59:06.348553 4775 scope.go:117] "RemoveContainer" containerID="32c4b76138b7d1251a29272c2f8ab0ce4cba04bd48729d29e94a8338acf4f447" Dec 16 14:59:06 crc kubenswrapper[4775]: E1216 
14:59:06.349019 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32c4b76138b7d1251a29272c2f8ab0ce4cba04bd48729d29e94a8338acf4f447\": container with ID starting with 32c4b76138b7d1251a29272c2f8ab0ce4cba04bd48729d29e94a8338acf4f447 not found: ID does not exist" containerID="32c4b76138b7d1251a29272c2f8ab0ce4cba04bd48729d29e94a8338acf4f447" Dec 16 14:59:06 crc kubenswrapper[4775]: I1216 14:59:06.349066 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32c4b76138b7d1251a29272c2f8ab0ce4cba04bd48729d29e94a8338acf4f447"} err="failed to get container status \"32c4b76138b7d1251a29272c2f8ab0ce4cba04bd48729d29e94a8338acf4f447\": rpc error: code = NotFound desc = could not find container \"32c4b76138b7d1251a29272c2f8ab0ce4cba04bd48729d29e94a8338acf4f447\": container with ID starting with 32c4b76138b7d1251a29272c2f8ab0ce4cba04bd48729d29e94a8338acf4f447 not found: ID does not exist" Dec 16 14:59:07 crc kubenswrapper[4775]: E1216 14:59:07.229664 4775 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.231618 4775 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 14:59:07 crc kubenswrapper[4775]: E1216 14:59:07.232090 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c" containerName="registry-server" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.232215 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c" containerName="registry-server" Dec 16 14:59:07 crc kubenswrapper[4775]: 
E1216 14:59:07.232305 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce7e6431-8250-485f-a202-a781b4b719cb" containerName="extract-utilities" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.232407 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7e6431-8250-485f-a202-a781b4b719cb" containerName="extract-utilities" Dec 16 14:59:07 crc kubenswrapper[4775]: E1216 14:59:07.232529 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce7e6431-8250-485f-a202-a781b4b719cb" containerName="extract-content" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.232657 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7e6431-8250-485f-a202-a781b4b719cb" containerName="extract-content" Dec 16 14:59:07 crc kubenswrapper[4775]: E1216 14:59:07.232758 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c" containerName="extract-utilities" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.232842 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c" containerName="extract-utilities" Dec 16 14:59:07 crc kubenswrapper[4775]: E1216 14:59:07.232967 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5046921-0655-4cef-b310-018ed7ea22c4" containerName="registry-server" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.233078 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5046921-0655-4cef-b310-018ed7ea22c4" containerName="registry-server" Dec 16 14:59:07 crc kubenswrapper[4775]: E1216 14:59:07.233203 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5046921-0655-4cef-b310-018ed7ea22c4" containerName="extract-content" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.233318 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5046921-0655-4cef-b310-018ed7ea22c4" containerName="extract-content" Dec 16 14:59:07 crc kubenswrapper[4775]: E1216 
14:59:07.233614 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5046921-0655-4cef-b310-018ed7ea22c4" containerName="extract-utilities" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.233697 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5046921-0655-4cef-b310-018ed7ea22c4" containerName="extract-utilities" Dec 16 14:59:07 crc kubenswrapper[4775]: E1216 14:59:07.233786 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c" containerName="extract-content" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.233869 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c" containerName="extract-content" Dec 16 14:59:07 crc kubenswrapper[4775]: E1216 14:59:07.233989 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce7e6431-8250-485f-a202-a781b4b719cb" containerName="registry-server" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.234075 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7e6431-8250-485f-a202-a781b4b719cb" containerName="registry-server" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.234322 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c" containerName="registry-server" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.234414 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5046921-0655-4cef-b310-018ed7ea22c4" containerName="registry-server" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.234503 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce7e6431-8250-485f-a202-a781b4b719cb" containerName="registry-server" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.235102 4775 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.235290 4775 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.235364 4775 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.235713 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a" gracePeriod=15 Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.235876 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138" gracePeriod=15 Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.235720 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857" gracePeriod=15 Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.235774 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e" gracePeriod=15 Dec 16 14:59:07 crc kubenswrapper[4775]: E1216 14:59:07.236100 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-syncer" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.236121 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 16 14:59:07 crc kubenswrapper[4775]: E1216 14:59:07.236132 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.236139 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 14:59:07 crc kubenswrapper[4775]: E1216 14:59:07.236147 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.236154 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.235786 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e" gracePeriod=15 Dec 16 14:59:07 crc kubenswrapper[4775]: E1216 14:59:07.236169 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.236228 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 16 14:59:07 crc kubenswrapper[4775]: E1216 14:59:07.236248 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.236258 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 16 14:59:07 crc kubenswrapper[4775]: E1216 14:59:07.236266 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.236273 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 16 14:59:07 crc kubenswrapper[4775]: E1216 14:59:07.236286 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.236294 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.236435 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.236451 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.236467 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.236476 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 
14:59:07.236483 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.236491 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.247769 4775 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.271433 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.345497 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c" path="/var/lib/kubelet/pods/a6dc29a9-5b48-4bc1-b223-f3b327a7bf1c/volumes" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.398360 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.398423 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 
14:59:07.398443 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.398473 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.398498 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.398514 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.398551 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:07 crc 
kubenswrapper[4775]: I1216 14:59:07.398577 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.507549 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.507631 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.507675 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.507702 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 
14:59:07.507729 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.507774 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.507801 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.507828 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.507865 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.507876 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.507935 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.508015 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.507920 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.508058 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.508158 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.508269 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: I1216 14:59:07.568662 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 14:59:07 crc kubenswrapper[4775]: W1216 14:59:07.594026 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-6b24b3f25cb98a11a5fb738aaeab1556cf06ab3314b966849c92ae1776d0aa35 WatchSource:0}: Error finding container 6b24b3f25cb98a11a5fb738aaeab1556cf06ab3314b966849c92ae1776d0aa35: Status 404 returned error can't find the container with id 6b24b3f25cb98a11a5fb738aaeab1556cf06ab3314b966849c92ae1776d0aa35 Dec 16 14:59:07 crc kubenswrapper[4775]: E1216 14:59:07.598461 4775 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1881ba19806e7f38 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 14:59:07.597619 +0000 UTC m=+272.548697913,LastTimestamp:2025-12-16 14:59:07.597619 +0000 UTC m=+272.548697913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 16 14:59:08 crc kubenswrapper[4775]: E1216 14:59:08.277564 4775 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1881ba19806e7f38 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 14:59:07.597619 +0000 UTC m=+272.548697913,LastTimestamp:2025-12-16 14:59:07.597619 +0000 UTC m=+272.548697913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 16 14:59:08 crc 
kubenswrapper[4775]: I1216 14:59:08.299766 4775 generic.go:334] "Generic (PLEG): container finished" podID="68e5173d-4139-4745-bbb1-a20286fbf0f3" containerID="7f6aa3415735c8934af5a042d801a60e84b699b4a7198703f69fd8c83e930c04" exitCode=0 Dec 16 14:59:08 crc kubenswrapper[4775]: I1216 14:59:08.299847 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"68e5173d-4139-4745-bbb1-a20286fbf0f3","Type":"ContainerDied","Data":"7f6aa3415735c8934af5a042d801a60e84b699b4a7198703f69fd8c83e930c04"} Dec 16 14:59:08 crc kubenswrapper[4775]: I1216 14:59:08.300596 4775 status_manager.go:851] "Failed to get status for pod" podUID="68e5173d-4139-4745-bbb1-a20286fbf0f3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:08 crc kubenswrapper[4775]: I1216 14:59:08.301110 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:08 crc kubenswrapper[4775]: I1216 14:59:08.302087 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 16 14:59:08 crc kubenswrapper[4775]: I1216 14:59:08.303349 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 16 14:59:08 crc kubenswrapper[4775]: I1216 14:59:08.304208 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857" exitCode=0 Dec 16 14:59:08 crc kubenswrapper[4775]: I1216 14:59:08.304271 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e" exitCode=0 Dec 16 14:59:08 crc kubenswrapper[4775]: I1216 14:59:08.304296 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e" exitCode=0 Dec 16 14:59:08 crc kubenswrapper[4775]: I1216 14:59:08.304311 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138" exitCode=2 Dec 16 14:59:08 crc kubenswrapper[4775]: I1216 14:59:08.304322 4775 scope.go:117] "RemoveContainer" containerID="7ac27c3ef28116003fd18250610f31b07e0ef7ae341d402197cf5f783a1412a6" Dec 16 14:59:08 crc kubenswrapper[4775]: I1216 14:59:08.307235 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"9cdda39e91cf4bead8a8025344cd0552633ad50212c2665431be084850260223"} Dec 16 14:59:08 crc kubenswrapper[4775]: I1216 14:59:08.307263 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6b24b3f25cb98a11a5fb738aaeab1556cf06ab3314b966849c92ae1776d0aa35"} Dec 16 14:59:08 crc kubenswrapper[4775]: I1216 14:59:08.308061 4775 status_manager.go:851] "Failed to get status for pod" podUID="68e5173d-4139-4745-bbb1-a20286fbf0f3" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:08 crc kubenswrapper[4775]: I1216 14:59:08.308693 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.316088 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 16 14:59:09 crc kubenswrapper[4775]: E1216 14:59:09.335879 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:09 crc kubenswrapper[4775]: E1216 14:59:09.336252 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:09 crc kubenswrapper[4775]: E1216 14:59:09.336686 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:09 crc kubenswrapper[4775]: E1216 14:59:09.337975 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 
16 14:59:09 crc kubenswrapper[4775]: E1216 14:59:09.338431 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.338492 4775 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 16 14:59:09 crc kubenswrapper[4775]: E1216 14:59:09.338945 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="200ms" Dec 16 14:59:09 crc kubenswrapper[4775]: E1216 14:59:09.549880 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="400ms" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.662110 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.662801 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.663188 4775 status_manager.go:851] "Failed to get status for pod" podUID="68e5173d-4139-4745-bbb1-a20286fbf0f3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.751747 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68e5173d-4139-4745-bbb1-a20286fbf0f3-kube-api-access\") pod \"68e5173d-4139-4745-bbb1-a20286fbf0f3\" (UID: \"68e5173d-4139-4745-bbb1-a20286fbf0f3\") " Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.752187 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/68e5173d-4139-4745-bbb1-a20286fbf0f3-var-lock\") pod \"68e5173d-4139-4745-bbb1-a20286fbf0f3\" (UID: \"68e5173d-4139-4745-bbb1-a20286fbf0f3\") " Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.752278 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68e5173d-4139-4745-bbb1-a20286fbf0f3-var-lock" (OuterVolumeSpecName: "var-lock") pod "68e5173d-4139-4745-bbb1-a20286fbf0f3" (UID: "68e5173d-4139-4745-bbb1-a20286fbf0f3"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.752349 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68e5173d-4139-4745-bbb1-a20286fbf0f3-kubelet-dir\") pod \"68e5173d-4139-4745-bbb1-a20286fbf0f3\" (UID: \"68e5173d-4139-4745-bbb1-a20286fbf0f3\") " Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.752414 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68e5173d-4139-4745-bbb1-a20286fbf0f3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "68e5173d-4139-4745-bbb1-a20286fbf0f3" (UID: "68e5173d-4139-4745-bbb1-a20286fbf0f3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.752671 4775 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68e5173d-4139-4745-bbb1-a20286fbf0f3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.752685 4775 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/68e5173d-4139-4745-bbb1-a20286fbf0f3-var-lock\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.759203 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68e5173d-4139-4745-bbb1-a20286fbf0f3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "68e5173d-4139-4745-bbb1-a20286fbf0f3" (UID: "68e5173d-4139-4745-bbb1-a20286fbf0f3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.817058 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.817790 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.818361 4775 status_manager.go:851] "Failed to get status for pod" podUID="68e5173d-4139-4745-bbb1-a20286fbf0f3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.818582 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.818775 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.853607 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68e5173d-4139-4745-bbb1-a20286fbf0f3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:09 crc kubenswrapper[4775]: E1216 
14:59:09.950968 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="800ms" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.954378 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.954468 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.954523 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.954791 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.954824 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 14:59:09 crc kubenswrapper[4775]: I1216 14:59:09.954840 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.055928 4775 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.055975 4775 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.055987 4775 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.326257 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"68e5173d-4139-4745-bbb1-a20286fbf0f3","Type":"ContainerDied","Data":"fff939026ac1e6efc91dd8317e6198d6794f3b2ef33a66aa2f50a57db6837cfb"} Dec 16 14:59:10 crc 
kubenswrapper[4775]: I1216 14:59:10.326298 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fff939026ac1e6efc91dd8317e6198d6794f3b2ef33a66aa2f50a57db6837cfb" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.327659 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.328767 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.329374 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a" exitCode=0 Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.329423 4775 scope.go:117] "RemoveContainer" containerID="b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.329569 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.343972 4775 scope.go:117] "RemoveContainer" containerID="f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.348600 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.350149 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.350557 4775 status_manager.go:851] "Failed to get status for pod" podUID="68e5173d-4139-4745-bbb1-a20286fbf0f3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.359038 4775 scope.go:117] "RemoveContainer" containerID="b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.359141 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 
38.102.83.223:6443: connect: connection refused" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.359576 4775 status_manager.go:851] "Failed to get status for pod" podUID="68e5173d-4139-4745-bbb1-a20286fbf0f3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.359999 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.378163 4775 scope.go:117] "RemoveContainer" containerID="a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.399619 4775 scope.go:117] "RemoveContainer" containerID="43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.417362 4775 scope.go:117] "RemoveContainer" containerID="eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.435851 4775 scope.go:117] "RemoveContainer" containerID="b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857" Dec 16 14:59:10 crc kubenswrapper[4775]: E1216 14:59:10.436335 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\": container with ID starting with b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857 not found: ID does not exist" containerID="b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857" 
Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.436368 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857"} err="failed to get container status \"b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\": rpc error: code = NotFound desc = could not find container \"b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857\": container with ID starting with b77b3832bbdea1a4359ac82e545ea3199bcb776f26fc6af839a37c794a340857 not found: ID does not exist" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.436392 4775 scope.go:117] "RemoveContainer" containerID="f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e" Dec 16 14:59:10 crc kubenswrapper[4775]: E1216 14:59:10.437054 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\": container with ID starting with f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e not found: ID does not exist" containerID="f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.437076 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e"} err="failed to get container status \"f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\": rpc error: code = NotFound desc = could not find container \"f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e\": container with ID starting with f42391552cdce7a8f6232ccccb1ea8c8f8ca42231dc11d8486644a2ca8eb630e not found: ID does not exist" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.437088 4775 scope.go:117] "RemoveContainer" 
containerID="b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e" Dec 16 14:59:10 crc kubenswrapper[4775]: E1216 14:59:10.437309 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\": container with ID starting with b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e not found: ID does not exist" containerID="b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.437325 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e"} err="failed to get container status \"b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\": rpc error: code = NotFound desc = could not find container \"b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e\": container with ID starting with b4ba2da69e2cc1cfaa4e830841616db1258509eafdd9dea640e48e1384c8232e not found: ID does not exist" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.437337 4775 scope.go:117] "RemoveContainer" containerID="a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138" Dec 16 14:59:10 crc kubenswrapper[4775]: E1216 14:59:10.437571 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\": container with ID starting with a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138 not found: ID does not exist" containerID="a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.437588 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138"} err="failed to get container status \"a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\": rpc error: code = NotFound desc = could not find container \"a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138\": container with ID starting with a20db3b2e0e001b94e835b567a4ba06e1f45cf194155189e2f37e14bd26f7138 not found: ID does not exist" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.437600 4775 scope.go:117] "RemoveContainer" containerID="43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a" Dec 16 14:59:10 crc kubenswrapper[4775]: E1216 14:59:10.437813 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\": container with ID starting with 43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a not found: ID does not exist" containerID="43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.437830 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a"} err="failed to get container status \"43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\": rpc error: code = NotFound desc = could not find container \"43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a\": container with ID starting with 43737ad18064b3f2ce0cc7c53895353641684684e4a9a0990ab42e10537d3d8a not found: ID does not exist" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.437841 4775 scope.go:117] "RemoveContainer" containerID="eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a" Dec 16 14:59:10 crc kubenswrapper[4775]: E1216 14:59:10.438073 4775 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\": container with ID starting with eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a not found: ID does not exist" containerID="eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a" Dec 16 14:59:10 crc kubenswrapper[4775]: I1216 14:59:10.438092 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a"} err="failed to get container status \"eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\": rpc error: code = NotFound desc = could not find container \"eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a\": container with ID starting with eb88e621cf1b9e5a4fcee745e8e3847025953f81fb3facb770f4565c41ca470a not found: ID does not exist" Dec 16 14:59:10 crc kubenswrapper[4775]: E1216 14:59:10.751918 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="1.6s" Dec 16 14:59:11 crc kubenswrapper[4775]: I1216 14:59:11.354822 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 16 14:59:12 crc kubenswrapper[4775]: E1216 14:59:12.353682 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="3.2s" Dec 16 14:59:15 crc kubenswrapper[4775]: I1216 14:59:15.340599 4775 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:15 crc kubenswrapper[4775]: I1216 14:59:15.341493 4775 status_manager.go:851] "Failed to get status for pod" podUID="68e5173d-4139-4745-bbb1-a20286fbf0f3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:15 crc kubenswrapper[4775]: E1216 14:59:15.555052 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="6.4s" Dec 16 14:59:18 crc kubenswrapper[4775]: E1216 14:59:18.278991 4775 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1881ba19806e7f38 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 14:59:07.597619 +0000 UTC 
m=+272.548697913,LastTimestamp:2025-12-16 14:59:07.597619 +0000 UTC m=+272.548697913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 16 14:59:21 crc kubenswrapper[4775]: E1216 14:59:21.956718 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="7s" Dec 16 14:59:22 crc kubenswrapper[4775]: I1216 14:59:22.337155 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:22 crc kubenswrapper[4775]: I1216 14:59:22.338195 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:22 crc kubenswrapper[4775]: I1216 14:59:22.340395 4775 status_manager.go:851] "Failed to get status for pod" podUID="68e5173d-4139-4745-bbb1-a20286fbf0f3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:22 crc kubenswrapper[4775]: I1216 14:59:22.360439 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="74ffbb56-0462-4316-819d-a579a172cbea" Dec 16 14:59:22 crc kubenswrapper[4775]: I1216 14:59:22.360478 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="74ffbb56-0462-4316-819d-a579a172cbea" Dec 16 14:59:22 crc 
kubenswrapper[4775]: E1216 14:59:22.361673 4775 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:22 crc kubenswrapper[4775]: I1216 14:59:22.362319 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:22 crc kubenswrapper[4775]: I1216 14:59:22.403422 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 16 14:59:22 crc kubenswrapper[4775]: I1216 14:59:22.403504 4775 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a" exitCode=1 Dec 16 14:59:22 crc kubenswrapper[4775]: I1216 14:59:22.403630 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a"} Dec 16 14:59:22 crc kubenswrapper[4775]: I1216 14:59:22.404442 4775 status_manager.go:851] "Failed to get status for pod" podUID="68e5173d-4139-4745-bbb1-a20286fbf0f3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:22 crc kubenswrapper[4775]: I1216 14:59:22.404490 4775 scope.go:117] "RemoveContainer" containerID="089041827ca4a5341c026c4d7cd65083559dc7ebe4979caaad3907396046762a" Dec 16 14:59:22 crc kubenswrapper[4775]: I1216 14:59:22.404836 4775 
status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:22 crc kubenswrapper[4775]: I1216 14:59:22.405217 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"11d50a20a31062fb2d4f7fb2589ba382c1051bd96baf48a6a6dad51c26b8ed8b"} Dec 16 14:59:22 crc kubenswrapper[4775]: I1216 14:59:22.405266 4775 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:23 crc kubenswrapper[4775]: I1216 14:59:23.411931 4775 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="f616211cc3757e8babe0a262aa789b9fb63c15e8c26cad22fee85f7a32148cf1" exitCode=0 Dec 16 14:59:23 crc kubenswrapper[4775]: I1216 14:59:23.412000 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"f616211cc3757e8babe0a262aa789b9fb63c15e8c26cad22fee85f7a32148cf1"} Dec 16 14:59:23 crc kubenswrapper[4775]: I1216 14:59:23.412112 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="74ffbb56-0462-4316-819d-a579a172cbea" Dec 16 14:59:23 crc kubenswrapper[4775]: I1216 14:59:23.412328 4775 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="74ffbb56-0462-4316-819d-a579a172cbea" Dec 16 14:59:23 crc kubenswrapper[4775]: I1216 14:59:23.412593 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:23 crc kubenswrapper[4775]: E1216 14:59:23.412803 4775 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:23 crc kubenswrapper[4775]: I1216 14:59:23.413224 4775 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:23 crc kubenswrapper[4775]: I1216 14:59:23.413473 4775 status_manager.go:851] "Failed to get status for pod" podUID="68e5173d-4139-4745-bbb1-a20286fbf0f3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:23 crc kubenswrapper[4775]: I1216 14:59:23.415911 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 16 14:59:23 crc kubenswrapper[4775]: I1216 14:59:23.415957 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"80aea8f276f57f8734b537d40623350d0e0a4d9547ac90a0c017b17ce9ae765a"} Dec 16 14:59:23 crc kubenswrapper[4775]: I1216 14:59:23.416351 4775 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:23 crc kubenswrapper[4775]: I1216 14:59:23.416644 4775 status_manager.go:851] "Failed to get status for pod" podUID="68e5173d-4139-4745-bbb1-a20286fbf0f3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:23 crc kubenswrapper[4775]: I1216 14:59:23.416988 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:23 crc kubenswrapper[4775]: I1216 14:59:23.449557 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:59:23 crc kubenswrapper[4775]: I1216 14:59:23.453180 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:59:23 crc kubenswrapper[4775]: I1216 14:59:23.453599 4775 status_manager.go:851] 
"Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:23 crc kubenswrapper[4775]: I1216 14:59:23.453878 4775 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:23 crc kubenswrapper[4775]: I1216 14:59:23.454233 4775 status_manager.go:851] "Failed to get status for pod" podUID="68e5173d-4139-4745-bbb1-a20286fbf0f3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Dec 16 14:59:24 crc kubenswrapper[4775]: I1216 14:59:24.427295 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"89c9c1443869921c92c29917d9e1e82ee4e66c15aa28bd9c0521c802065dd071"} Dec 16 14:59:24 crc kubenswrapper[4775]: I1216 14:59:24.427655 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:59:24 crc kubenswrapper[4775]: I1216 14:59:24.427673 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4680c39580fdd91e660525de09d5c27a60c4a86e149657c1817b726f622eddb9"} Dec 16 14:59:24 crc 
kubenswrapper[4775]: I1216 14:59:24.427683 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"484e5962988d378202e4663b7913570604b8f70cde8e1899ee5754644c36fc89"} Dec 16 14:59:24 crc kubenswrapper[4775]: I1216 14:59:24.427695 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e12250504130b4d3d4ee493a5553d709b029e0235ee5fc66aca14b2bf842fff3"} Dec 16 14:59:25 crc kubenswrapper[4775]: I1216 14:59:25.435391 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="74ffbb56-0462-4316-819d-a579a172cbea" Dec 16 14:59:25 crc kubenswrapper[4775]: I1216 14:59:25.435429 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="74ffbb56-0462-4316-819d-a579a172cbea" Dec 16 14:59:25 crc kubenswrapper[4775]: I1216 14:59:25.435624 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"be65e21e00c0a4c98ae3d579d95640afaf39c0a24f1b217cd9732b710714e4e8"} Dec 16 14:59:25 crc kubenswrapper[4775]: I1216 14:59:25.435670 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:27 crc kubenswrapper[4775]: I1216 14:59:27.363058 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:27 crc kubenswrapper[4775]: I1216 14:59:27.363625 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:27 crc kubenswrapper[4775]: I1216 14:59:27.369935 4775 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.152181 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" podUID="2b5f39f2-f4e2-4306-b64c-669ca82f8869" containerName="oauth-openshift" containerID="cri-o://0ac76ce0bbd6945ba21bf822d8c7383327a70e49e84cbe910367f5364d7d98a8" gracePeriod=15 Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.455866 4775 generic.go:334] "Generic (PLEG): container finished" podID="2b5f39f2-f4e2-4306-b64c-669ca82f8869" containerID="0ac76ce0bbd6945ba21bf822d8c7383327a70e49e84cbe910367f5364d7d98a8" exitCode=0 Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.456006 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" event={"ID":"2b5f39f2-f4e2-4306-b64c-669ca82f8869","Type":"ContainerDied","Data":"0ac76ce0bbd6945ba21bf822d8c7383327a70e49e84cbe910367f5364d7d98a8"} Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.638111 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.831228 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b5f39f2-f4e2-4306-b64c-669ca82f8869-audit-dir\") pod \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.831756 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-router-certs\") pod \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.831492 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b5f39f2-f4e2-4306-b64c-669ca82f8869-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "2b5f39f2-f4e2-4306-b64c-669ca82f8869" (UID: "2b5f39f2-f4e2-4306-b64c-669ca82f8869"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.831846 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-provider-selection\") pod \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.831983 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-trusted-ca-bundle\") pod \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.832064 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-service-ca\") pod \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.832118 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-cliconfig\") pod \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.832173 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-serving-cert\") pod \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\" (UID: 
\"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.832248 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwl9g\" (UniqueName: \"kubernetes.io/projected/2b5f39f2-f4e2-4306-b64c-669ca82f8869-kube-api-access-lwl9g\") pod \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.832308 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-session\") pod \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.832384 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-idp-0-file-data\") pod \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.832453 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-error\") pod \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.832517 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-login\") pod \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 
14:59:28.832573 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-ocp-branding-template\") pod \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.832641 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-audit-policies\") pod \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\" (UID: \"2b5f39f2-f4e2-4306-b64c-669ca82f8869\") " Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.833288 4775 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b5f39f2-f4e2-4306-b64c-669ca82f8869-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.834077 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "2b5f39f2-f4e2-4306-b64c-669ca82f8869" (UID: "2b5f39f2-f4e2-4306-b64c-669ca82f8869"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.834257 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "2b5f39f2-f4e2-4306-b64c-669ca82f8869" (UID: "2b5f39f2-f4e2-4306-b64c-669ca82f8869"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.834368 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "2b5f39f2-f4e2-4306-b64c-669ca82f8869" (UID: "2b5f39f2-f4e2-4306-b64c-669ca82f8869"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.835332 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "2b5f39f2-f4e2-4306-b64c-669ca82f8869" (UID: "2b5f39f2-f4e2-4306-b64c-669ca82f8869"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.838949 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "2b5f39f2-f4e2-4306-b64c-669ca82f8869" (UID: "2b5f39f2-f4e2-4306-b64c-669ca82f8869"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.839526 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "2b5f39f2-f4e2-4306-b64c-669ca82f8869" (UID: "2b5f39f2-f4e2-4306-b64c-669ca82f8869"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.839671 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "2b5f39f2-f4e2-4306-b64c-669ca82f8869" (UID: "2b5f39f2-f4e2-4306-b64c-669ca82f8869"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.840077 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "2b5f39f2-f4e2-4306-b64c-669ca82f8869" (UID: "2b5f39f2-f4e2-4306-b64c-669ca82f8869"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.840309 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "2b5f39f2-f4e2-4306-b64c-669ca82f8869" (UID: "2b5f39f2-f4e2-4306-b64c-669ca82f8869"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.840593 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b5f39f2-f4e2-4306-b64c-669ca82f8869-kube-api-access-lwl9g" (OuterVolumeSpecName: "kube-api-access-lwl9g") pod "2b5f39f2-f4e2-4306-b64c-669ca82f8869" (UID: "2b5f39f2-f4e2-4306-b64c-669ca82f8869"). InnerVolumeSpecName "kube-api-access-lwl9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.841026 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "2b5f39f2-f4e2-4306-b64c-669ca82f8869" (UID: "2b5f39f2-f4e2-4306-b64c-669ca82f8869"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.841285 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "2b5f39f2-f4e2-4306-b64c-669ca82f8869" (UID: "2b5f39f2-f4e2-4306-b64c-669ca82f8869"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.846131 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "2b5f39f2-f4e2-4306-b64c-669ca82f8869" (UID: "2b5f39f2-f4e2-4306-b64c-669ca82f8869"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.939006 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.939071 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.939095 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.939120 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.939144 4775 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.939164 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.939195 4775 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.939226 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.939253 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.939279 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.939305 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.939335 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwl9g\" (UniqueName: \"kubernetes.io/projected/2b5f39f2-f4e2-4306-b64c-669ca82f8869-kube-api-access-lwl9g\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:28 crc kubenswrapper[4775]: I1216 14:59:28.939361 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2b5f39f2-f4e2-4306-b64c-669ca82f8869-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:29 crc 
kubenswrapper[4775]: I1216 14:59:29.463300 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" event={"ID":"2b5f39f2-f4e2-4306-b64c-669ca82f8869","Type":"ContainerDied","Data":"4d9a1b9ba38208559a24b67393be91b93f8c7ca0717a3308d006ad9eb7e44b84"} Dec 16 14:59:29 crc kubenswrapper[4775]: I1216 14:59:29.463365 4775 scope.go:117] "RemoveContainer" containerID="0ac76ce0bbd6945ba21bf822d8c7383327a70e49e84cbe910367f5364d7d98a8" Dec 16 14:59:29 crc kubenswrapper[4775]: I1216 14:59:29.463490 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ms9lk" Dec 16 14:59:30 crc kubenswrapper[4775]: I1216 14:59:30.446568 4775 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:30 crc kubenswrapper[4775]: I1216 14:59:30.476400 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74ffbb56-0462-4316-819d-a579a172cbea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e12250504130b4d3d4ee493a5553d709b029e0235ee5fc66aca14b2bf842fff3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4680c39580fdd91e660525de09d5c27a60c4a86e149657c1817b726f622eddb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://484e5962988d378202e4663b7913570604b8f70cde8e1899ee5754644c36fc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e
6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:59:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be65e21e00c0a4c98ae3d579d95640afaf39c0a24f1b217cd9732b710714e4e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:59:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9c1443869921c92c29917d9e1e82ee4e66c15aa28bd9c0521c802065dd071\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:59:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Pod \"kube-apiserver-crc\" is invalid: metadata.uid: Invalid value: \"74ffbb56-0462-4316-819d-a579a172cbea\": field is immutable" Dec 16 14:59:30 crc kubenswrapper[4775]: E1216 14:59:30.823524 4775 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-error\": Failed to watch *v1.Secret: unknown (get secrets)" logger="UnhandledError" Dec 16 14:59:31 crc kubenswrapper[4775]: I1216 14:59:31.474717 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="74ffbb56-0462-4316-819d-a579a172cbea" Dec 16 14:59:31 crc kubenswrapper[4775]: I1216 14:59:31.474758 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="74ffbb56-0462-4316-819d-a579a172cbea" Dec 16 14:59:31 crc kubenswrapper[4775]: I1216 14:59:31.479730 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:31 crc kubenswrapper[4775]: I1216 14:59:31.482957 4775 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="40945fff-5775-401a-8c16-a355093f746d" Dec 16 14:59:32 crc kubenswrapper[4775]: I1216 14:59:32.480329 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="74ffbb56-0462-4316-819d-a579a172cbea" Dec 16 14:59:32 crc kubenswrapper[4775]: I1216 14:59:32.480363 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="74ffbb56-0462-4316-819d-a579a172cbea" Dec 16 14:59:34 crc kubenswrapper[4775]: I1216 14:59:34.670344 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:59:35 crc kubenswrapper[4775]: I1216 14:59:35.206062 4775 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 16 14:59:35 crc kubenswrapper[4775]: I1216 14:59:35.359179 4775 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="40945fff-5775-401a-8c16-a355093f746d" Dec 16 14:59:39 crc kubenswrapper[4775]: I1216 14:59:39.636764 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 16 14:59:39 crc kubenswrapper[4775]: I1216 14:59:39.653729 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 16 14:59:39 crc kubenswrapper[4775]: I1216 14:59:39.881632 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 16 14:59:40 crc kubenswrapper[4775]: I1216 14:59:40.318738 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 16 14:59:40 crc kubenswrapper[4775]: I1216 14:59:40.591718 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 16 14:59:40 crc kubenswrapper[4775]: I1216 14:59:40.910706 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 16 14:59:40 crc kubenswrapper[4775]: I1216 14:59:40.918020 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 14:59:41 crc kubenswrapper[4775]: I1216 14:59:41.161335 4775 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 16 14:59:41 crc kubenswrapper[4775]: I1216 14:59:41.350041 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 16 14:59:41 crc kubenswrapper[4775]: I1216 14:59:41.564699 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 16 14:59:41 crc kubenswrapper[4775]: I1216 14:59:41.581122 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 16 14:59:41 crc kubenswrapper[4775]: I1216 14:59:41.630693 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 16 14:59:41 crc kubenswrapper[4775]: I1216 14:59:41.641284 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 14:59:41 crc kubenswrapper[4775]: I1216 14:59:41.789220 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 16 14:59:41 crc kubenswrapper[4775]: I1216 14:59:41.808327 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 16 14:59:41 crc kubenswrapper[4775]: I1216 14:59:41.949178 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 16 14:59:41 crc kubenswrapper[4775]: I1216 14:59:41.964411 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 16 14:59:41 crc kubenswrapper[4775]: I1216 14:59:41.964971 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 16 14:59:42 crc kubenswrapper[4775]: I1216 14:59:42.051200 4775 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 16 14:59:42 crc kubenswrapper[4775]: I1216 14:59:42.148036 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 16 14:59:42 crc kubenswrapper[4775]: I1216 14:59:42.155491 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 16 14:59:42 crc kubenswrapper[4775]: I1216 14:59:42.177291 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 16 14:59:42 crc kubenswrapper[4775]: I1216 14:59:42.278842 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 16 14:59:42 crc kubenswrapper[4775]: I1216 14:59:42.344147 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 16 14:59:42 crc kubenswrapper[4775]: I1216 14:59:42.362506 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 16 14:59:42 crc kubenswrapper[4775]: I1216 14:59:42.368576 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 16 14:59:42 crc kubenswrapper[4775]: I1216 14:59:42.429690 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 16 14:59:42 crc kubenswrapper[4775]: I1216 14:59:42.638769 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 16 14:59:42 crc kubenswrapper[4775]: I1216 14:59:42.658038 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 16 14:59:42 crc 
kubenswrapper[4775]: I1216 14:59:42.744675 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 16 14:59:42 crc kubenswrapper[4775]: I1216 14:59:42.811293 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 16 14:59:42 crc kubenswrapper[4775]: I1216 14:59:42.873046 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 16 14:59:42 crc kubenswrapper[4775]: I1216 14:59:42.970955 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 16 14:59:42 crc kubenswrapper[4775]: I1216 14:59:42.997803 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 16 14:59:43 crc kubenswrapper[4775]: I1216 14:59:43.052127 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 16 14:59:43 crc kubenswrapper[4775]: I1216 14:59:43.109524 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 16 14:59:43 crc kubenswrapper[4775]: I1216 14:59:43.122664 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 16 14:59:43 crc kubenswrapper[4775]: I1216 14:59:43.330301 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 16 14:59:43 crc kubenswrapper[4775]: I1216 14:59:43.562763 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 16 14:59:43 crc kubenswrapper[4775]: I1216 14:59:43.576773 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 16 14:59:43 crc kubenswrapper[4775]: I1216 14:59:43.591040 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 16 14:59:43 crc kubenswrapper[4775]: I1216 14:59:43.642880 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 16 14:59:43 crc kubenswrapper[4775]: I1216 14:59:43.694380 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 16 14:59:43 crc kubenswrapper[4775]: I1216 14:59:43.694677 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 16 14:59:43 crc kubenswrapper[4775]: I1216 14:59:43.723201 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 16 14:59:43 crc kubenswrapper[4775]: I1216 14:59:43.802570 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 16 14:59:43 crc kubenswrapper[4775]: I1216 14:59:43.828410 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 16 14:59:43 crc kubenswrapper[4775]: I1216 14:59:43.860305 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 16 14:59:43 crc kubenswrapper[4775]: I1216 14:59:43.901447 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 16 14:59:43 crc kubenswrapper[4775]: I1216 14:59:43.921973 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 16 14:59:43 crc kubenswrapper[4775]: I1216 14:59:43.998032 4775 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.030801 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.141422 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.182866 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.276319 4775 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.277809 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.279999 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.287716 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.325697 4775 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.339457 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.378252 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.443249 4775 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.475027 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.477085 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.566781 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.567835 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.588180 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.959275 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 16 14:59:44 crc kubenswrapper[4775]: I1216 14:59:44.986483 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 16 14:59:45 crc kubenswrapper[4775]: I1216 14:59:45.126571 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 16 14:59:45 crc kubenswrapper[4775]: I1216 14:59:45.130309 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 16 14:59:45 crc kubenswrapper[4775]: I1216 14:59:45.355354 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"kube-root-ca.crt" Dec 16 14:59:45 crc kubenswrapper[4775]: I1216 14:59:45.361698 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 16 14:59:45 crc kubenswrapper[4775]: I1216 14:59:45.370879 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 16 14:59:45 crc kubenswrapper[4775]: I1216 14:59:45.419351 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 14:59:45 crc kubenswrapper[4775]: I1216 14:59:45.475977 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 16 14:59:45 crc kubenswrapper[4775]: I1216 14:59:45.569335 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 16 14:59:45 crc kubenswrapper[4775]: I1216 14:59:45.640912 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 16 14:59:45 crc kubenswrapper[4775]: I1216 14:59:45.854438 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 16 14:59:45 crc kubenswrapper[4775]: I1216 14:59:45.996565 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 16 14:59:46 crc kubenswrapper[4775]: I1216 14:59:46.054614 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 16 14:59:46 crc kubenswrapper[4775]: I1216 14:59:46.269391 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 16 14:59:46 crc kubenswrapper[4775]: I1216 14:59:46.299833 4775 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 16 14:59:46 crc kubenswrapper[4775]: I1216 14:59:46.315760 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 16 14:59:46 crc kubenswrapper[4775]: I1216 14:59:46.385198 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 14:59:46 crc kubenswrapper[4775]: I1216 14:59:46.440347 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 16 14:59:46 crc kubenswrapper[4775]: I1216 14:59:46.511506 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 16 14:59:46 crc kubenswrapper[4775]: I1216 14:59:46.629465 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 16 14:59:46 crc kubenswrapper[4775]: I1216 14:59:46.632369 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 16 14:59:46 crc kubenswrapper[4775]: I1216 14:59:46.651030 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 16 14:59:46 crc kubenswrapper[4775]: I1216 14:59:46.665043 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 16 14:59:46 crc kubenswrapper[4775]: I1216 14:59:46.684755 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 16 14:59:46 crc kubenswrapper[4775]: I1216 14:59:46.737422 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 16 14:59:46 crc kubenswrapper[4775]: I1216 
14:59:46.828512 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 16 14:59:46 crc kubenswrapper[4775]: I1216 14:59:46.930160 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.004448 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.024076 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.077591 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.163558 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.174182 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.175824 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.182098 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.286330 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.303863 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"openshift-service-ca.crt" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.304415 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.328200 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.448987 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.467774 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.590593 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.604945 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.605018 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.751984 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.812043 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.821607 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 
14:59:47.832270 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.837147 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.900974 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 16 14:59:47 crc kubenswrapper[4775]: I1216 14:59:47.965942 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 16 14:59:48 crc kubenswrapper[4775]: I1216 14:59:48.039364 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 16 14:59:48 crc kubenswrapper[4775]: I1216 14:59:48.154226 4775 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 16 14:59:48 crc kubenswrapper[4775]: I1216 14:59:48.264626 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 16 14:59:48 crc kubenswrapper[4775]: I1216 14:59:48.353500 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 16 14:59:48 crc kubenswrapper[4775]: I1216 14:59:48.355271 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 16 14:59:48 crc kubenswrapper[4775]: I1216 14:59:48.368729 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 16 14:59:48 crc kubenswrapper[4775]: I1216 14:59:48.419873 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 16 
14:59:48 crc kubenswrapper[4775]: I1216 14:59:48.576529 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 16 14:59:48 crc kubenswrapper[4775]: I1216 14:59:48.676186 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 16 14:59:48 crc kubenswrapper[4775]: I1216 14:59:48.718485 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 16 14:59:48 crc kubenswrapper[4775]: I1216 14:59:48.759550 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 16 14:59:48 crc kubenswrapper[4775]: I1216 14:59:48.805707 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 16 14:59:48 crc kubenswrapper[4775]: I1216 14:59:48.827422 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.026240 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.047006 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.079334 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.141132 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.345727 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"etcd-serving-ca" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.365495 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.391224 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.491006 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.517003 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.654536 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.685051 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.744279 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.774073 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.779151 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.791590 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 
14:59:49.829924 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.892555 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.919839 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.936740 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.986543 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 16 14:59:49 crc kubenswrapper[4775]: I1216 14:59:49.998219 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.002165 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.032464 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.075483 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.117604 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.216106 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.279556 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.328899 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.382657 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.386244 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.423702 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.429647 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.459650 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.497004 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.632240 4775 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.650741 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 16 14:59:50 
crc kubenswrapper[4775]: I1216 14:59:50.651153 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.707763 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.775017 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.854577 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.961264 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 16 14:59:50 crc kubenswrapper[4775]: I1216 14:59:50.970109 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.023308 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.042386 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.202628 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.262584 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.302734 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.326116 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.341557 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.364981 4775 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.376213 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.478672 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.517206 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.531724 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.539125 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.601292 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.712510 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.794828 4775 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.825702 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 16 14:59:51 crc kubenswrapper[4775]: I1216 14:59:51.906503 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 16 14:59:52 crc kubenswrapper[4775]: I1216 14:59:52.070422 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 16 14:59:52 crc kubenswrapper[4775]: I1216 14:59:52.214683 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 16 14:59:52 crc kubenswrapper[4775]: I1216 14:59:52.222822 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 16 14:59:52 crc kubenswrapper[4775]: I1216 14:59:52.273705 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 16 14:59:52 crc kubenswrapper[4775]: I1216 14:59:52.329561 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 16 14:59:52 crc kubenswrapper[4775]: I1216 14:59:52.411658 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 14:59:52 crc kubenswrapper[4775]: I1216 14:59:52.451088 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 16 14:59:52 crc kubenswrapper[4775]: I1216 14:59:52.473945 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 16 14:59:52 crc kubenswrapper[4775]: I1216 14:59:52.592856 4775 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 16 14:59:52 crc kubenswrapper[4775]: I1216 14:59:52.603411 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 16 14:59:52 crc kubenswrapper[4775]: I1216 14:59:52.641538 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 14:59:52 crc kubenswrapper[4775]: I1216 14:59:52.678650 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 16 14:59:52 crc kubenswrapper[4775]: I1216 14:59:52.738770 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 16 14:59:52 crc kubenswrapper[4775]: I1216 14:59:52.795157 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 16 14:59:52 crc kubenswrapper[4775]: I1216 14:59:52.838335 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 16 14:59:52 crc kubenswrapper[4775]: I1216 14:59:52.839950 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 16 14:59:52 crc kubenswrapper[4775]: I1216 14:59:52.903460 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 16 14:59:53 crc kubenswrapper[4775]: I1216 14:59:53.000816 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 16 14:59:53 crc kubenswrapper[4775]: I1216 14:59:53.031730 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 16 
14:59:53 crc kubenswrapper[4775]: I1216 14:59:53.060858 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 16 14:59:53 crc kubenswrapper[4775]: I1216 14:59:53.239037 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 16 14:59:53 crc kubenswrapper[4775]: I1216 14:59:53.267833 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 16 14:59:53 crc kubenswrapper[4775]: I1216 14:59:53.289571 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 16 14:59:53 crc kubenswrapper[4775]: I1216 14:59:53.441147 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 16 14:59:53 crc kubenswrapper[4775]: I1216 14:59:53.577731 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 16 14:59:53 crc kubenswrapper[4775]: I1216 14:59:53.655809 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 16 14:59:53 crc kubenswrapper[4775]: I1216 14:59:53.675797 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 16 14:59:53 crc kubenswrapper[4775]: I1216 14:59:53.796401 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 16 14:59:53 crc kubenswrapper[4775]: I1216 14:59:53.884849 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 16 14:59:53 crc kubenswrapper[4775]: I1216 14:59:53.909472 4775 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 16 14:59:53 crc kubenswrapper[4775]: I1216 14:59:53.979557 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.306215 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.370662 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.521077 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.757987 4775 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.758517 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=47.758503225 podStartE2EDuration="47.758503225s" podCreationTimestamp="2025-12-16 14:59:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:30.427277848 +0000 UTC m=+295.378356791" watchObservedRunningTime="2025-12-16 14:59:54.758503225 +0000 UTC m=+319.709582148" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.762297 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ms9lk","openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.762347 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-58bd64d8b5-wplht"] Dec 16 14:59:54 crc kubenswrapper[4775]: E1216 14:59:54.762538 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b5f39f2-f4e2-4306-b64c-669ca82f8869" containerName="oauth-openshift" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.762559 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5f39f2-f4e2-4306-b64c-669ca82f8869" containerName="oauth-openshift" Dec 16 14:59:54 crc kubenswrapper[4775]: E1216 14:59:54.762579 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68e5173d-4139-4745-bbb1-a20286fbf0f3" containerName="installer" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.762586 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e5173d-4139-4745-bbb1-a20286fbf0f3" containerName="installer" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.762679 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="68e5173d-4139-4745-bbb1-a20286fbf0f3" containerName="installer" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.762688 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b5f39f2-f4e2-4306-b64c-669ca82f8869" containerName="oauth-openshift" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.763106 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.769406 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.769708 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.770396 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.770534 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.771670 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.773575 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.774492 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.774685 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.774876 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.776479 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 16 14:59:54 
crc kubenswrapper[4775]: I1216 14:59:54.778905 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.793698 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.794257 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.794272 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.794405 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.796917 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.798112 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-session\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.798141 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-router-certs\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " 
pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.798164 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v9tv\" (UniqueName: \"kubernetes.io/projected/fa2099f6-180a-4c53-9a2e-6d7733162357-kube-api-access-9v9tv\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.798238 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.798277 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fa2099f6-180a-4c53-9a2e-6d7733162357-audit-dir\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.798294 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.798379 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fa2099f6-180a-4c53-9a2e-6d7733162357-audit-policies\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.798428 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.798462 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-user-template-error\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.798486 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-service-ca\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.798502 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.798530 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.798597 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-user-template-login\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.798622 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.805816 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.823772 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.835909 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.860558 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.860540009 podStartE2EDuration="24.860540009s" podCreationTimestamp="2025-12-16 14:59:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:54.833339989 +0000 UTC m=+319.784418912" watchObservedRunningTime="2025-12-16 14:59:54.860540009 +0000 UTC m=+319.811618932" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.899421 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-user-template-login\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.899480 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.899510 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-session\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.899539 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-router-certs\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.899564 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v9tv\" (UniqueName: \"kubernetes.io/projected/fa2099f6-180a-4c53-9a2e-6d7733162357-kube-api-access-9v9tv\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.899605 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.899640 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fa2099f6-180a-4c53-9a2e-6d7733162357-audit-dir\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc 
kubenswrapper[4775]: I1216 14:59:54.899663 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.899700 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fa2099f6-180a-4c53-9a2e-6d7733162357-audit-policies\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.899733 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.899770 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-user-template-error\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.899801 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-service-ca\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.899826 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.899830 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fa2099f6-180a-4c53-9a2e-6d7733162357-audit-dir\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.899862 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.900763 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 
14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.900768 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fa2099f6-180a-4c53-9a2e-6d7733162357-audit-policies\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.901440 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-service-ca\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.901560 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.905295 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.905748 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-session\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.908195 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-router-certs\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.908668 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.909170 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.909423 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-user-template-error\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 
14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.910033 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-user-template-login\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.916906 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fa2099f6-180a-4c53-9a2e-6d7733162357-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:54 crc kubenswrapper[4775]: I1216 14:59:54.922750 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v9tv\" (UniqueName: \"kubernetes.io/projected/fa2099f6-180a-4c53-9a2e-6d7733162357-kube-api-access-9v9tv\") pod \"oauth-openshift-58bd64d8b5-wplht\" (UID: \"fa2099f6-180a-4c53-9a2e-6d7733162357\") " pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:55 crc kubenswrapper[4775]: I1216 14:59:55.081814 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:55 crc kubenswrapper[4775]: I1216 14:59:55.345616 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b5f39f2-f4e2-4306-b64c-669ca82f8869" path="/var/lib/kubelet/pods/2b5f39f2-f4e2-4306-b64c-669ca82f8869/volumes" Dec 16 14:59:55 crc kubenswrapper[4775]: I1216 14:59:55.361945 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 16 14:59:55 crc kubenswrapper[4775]: I1216 14:59:55.398728 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 16 14:59:55 crc kubenswrapper[4775]: I1216 14:59:55.488042 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-58bd64d8b5-wplht"] Dec 16 14:59:55 crc kubenswrapper[4775]: I1216 14:59:55.628568 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" event={"ID":"fa2099f6-180a-4c53-9a2e-6d7733162357","Type":"ContainerStarted","Data":"813600537f9088af38fcb6566a5548d26c93e5f6abb608ad70c11d63b5c122d1"} Dec 16 14:59:55 crc kubenswrapper[4775]: I1216 14:59:55.710829 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 16 14:59:55 crc kubenswrapper[4775]: I1216 14:59:55.717451 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-767c66895f-dfn2j"] Dec 16 14:59:55 crc kubenswrapper[4775]: I1216 14:59:55.717733 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" podUID="d009283a-9bbd-455c-b5c2-4ed6c3336b52" containerName="controller-manager" containerID="cri-o://71a5256bd60c43445f1d19512f8df0dd1cce666a9a3e4a4e6f5acaff757aad4b" gracePeriod=30 Dec 16 14:59:55 crc 
kubenswrapper[4775]: I1216 14:59:55.819436 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"] Dec 16 14:59:55 crc kubenswrapper[4775]: I1216 14:59:55.819744 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk" podUID="61dbff96-b237-4949-99e1-774bcec682f4" containerName="route-controller-manager" containerID="cri-o://7eeb6757a6a2e10e762779d8e360b7b17a41e0a5fdf191620a2d7da708270806" gracePeriod=30 Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.195480 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.320042 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61dbff96-b237-4949-99e1-774bcec682f4-serving-cert\") pod \"61dbff96-b237-4949-99e1-774bcec682f4\" (UID: \"61dbff96-b237-4949-99e1-774bcec682f4\") " Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.320209 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsj62\" (UniqueName: \"kubernetes.io/projected/61dbff96-b237-4949-99e1-774bcec682f4-kube-api-access-zsj62\") pod \"61dbff96-b237-4949-99e1-774bcec682f4\" (UID: \"61dbff96-b237-4949-99e1-774bcec682f4\") " Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.320290 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61dbff96-b237-4949-99e1-774bcec682f4-client-ca\") pod \"61dbff96-b237-4949-99e1-774bcec682f4\" (UID: \"61dbff96-b237-4949-99e1-774bcec682f4\") " Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.320336 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61dbff96-b237-4949-99e1-774bcec682f4-config\") pod \"61dbff96-b237-4949-99e1-774bcec682f4\" (UID: \"61dbff96-b237-4949-99e1-774bcec682f4\") " Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.321426 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61dbff96-b237-4949-99e1-774bcec682f4-config" (OuterVolumeSpecName: "config") pod "61dbff96-b237-4949-99e1-774bcec682f4" (UID: "61dbff96-b237-4949-99e1-774bcec682f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.321477 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61dbff96-b237-4949-99e1-774bcec682f4-client-ca" (OuterVolumeSpecName: "client-ca") pod "61dbff96-b237-4949-99e1-774bcec682f4" (UID: "61dbff96-b237-4949-99e1-774bcec682f4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.326833 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61dbff96-b237-4949-99e1-774bcec682f4-kube-api-access-zsj62" (OuterVolumeSpecName: "kube-api-access-zsj62") pod "61dbff96-b237-4949-99e1-774bcec682f4" (UID: "61dbff96-b237-4949-99e1-774bcec682f4"). InnerVolumeSpecName "kube-api-access-zsj62". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.327083 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61dbff96-b237-4949-99e1-774bcec682f4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "61dbff96-b237-4949-99e1-774bcec682f4" (UID: "61dbff96-b237-4949-99e1-774bcec682f4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.422566 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61dbff96-b237-4949-99e1-774bcec682f4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.422597 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsj62\" (UniqueName: \"kubernetes.io/projected/61dbff96-b237-4949-99e1-774bcec682f4-kube-api-access-zsj62\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.422614 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61dbff96-b237-4949-99e1-774bcec682f4-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.422627 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61dbff96-b237-4949-99e1-774bcec682f4-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.563333 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.636767 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" event={"ID":"fa2099f6-180a-4c53-9a2e-6d7733162357","Type":"ContainerStarted","Data":"1de009ab86e87304d462392d530432133b7b1ced9890c5259b37a0b2ebbb631f"} Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.637088 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.638211 4775 generic.go:334] "Generic (PLEG): container finished" podID="61dbff96-b237-4949-99e1-774bcec682f4" containerID="7eeb6757a6a2e10e762779d8e360b7b17a41e0a5fdf191620a2d7da708270806" exitCode=0 Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.638270 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.638299 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk" event={"ID":"61dbff96-b237-4949-99e1-774bcec682f4","Type":"ContainerDied","Data":"7eeb6757a6a2e10e762779d8e360b7b17a41e0a5fdf191620a2d7da708270806"} Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.638373 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk" event={"ID":"61dbff96-b237-4949-99e1-774bcec682f4","Type":"ContainerDied","Data":"92e524de76350cc104ceaf1c6a957f3f463b1cbf124ce7688a67da11d290c05b"} Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.638395 4775 scope.go:117] "RemoveContainer" containerID="7eeb6757a6a2e10e762779d8e360b7b17a41e0a5fdf191620a2d7da708270806" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.639826 4775 generic.go:334] "Generic (PLEG): container finished" podID="d009283a-9bbd-455c-b5c2-4ed6c3336b52" containerID="71a5256bd60c43445f1d19512f8df0dd1cce666a9a3e4a4e6f5acaff757aad4b" exitCode=0 Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.639859 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" event={"ID":"d009283a-9bbd-455c-b5c2-4ed6c3336b52","Type":"ContainerDied","Data":"71a5256bd60c43445f1d19512f8df0dd1cce666a9a3e4a4e6f5acaff757aad4b"} Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.639879 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" event={"ID":"d009283a-9bbd-455c-b5c2-4ed6c3336b52","Type":"ContainerDied","Data":"4847f85f823721dfe69bfd288f58c3206e85c9da19d07125de3aaaa8e9262c46"} Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.639959 4775 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-767c66895f-dfn2j" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.648217 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.657387 4775 scope.go:117] "RemoveContainer" containerID="7eeb6757a6a2e10e762779d8e360b7b17a41e0a5fdf191620a2d7da708270806" Dec 16 14:59:56 crc kubenswrapper[4775]: E1216 14:59:56.659191 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eeb6757a6a2e10e762779d8e360b7b17a41e0a5fdf191620a2d7da708270806\": container with ID starting with 7eeb6757a6a2e10e762779d8e360b7b17a41e0a5fdf191620a2d7da708270806 not found: ID does not exist" containerID="7eeb6757a6a2e10e762779d8e360b7b17a41e0a5fdf191620a2d7da708270806" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.659257 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eeb6757a6a2e10e762779d8e360b7b17a41e0a5fdf191620a2d7da708270806"} err="failed to get container status \"7eeb6757a6a2e10e762779d8e360b7b17a41e0a5fdf191620a2d7da708270806\": rpc error: code = NotFound desc = could not find container \"7eeb6757a6a2e10e762779d8e360b7b17a41e0a5fdf191620a2d7da708270806\": container with ID starting with 7eeb6757a6a2e10e762779d8e360b7b17a41e0a5fdf191620a2d7da708270806 not found: ID does not exist" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.659291 4775 scope.go:117] "RemoveContainer" containerID="71a5256bd60c43445f1d19512f8df0dd1cce666a9a3e4a4e6f5acaff757aad4b" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.662118 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-58bd64d8b5-wplht" podStartSLOduration=53.662099716 
podStartE2EDuration="53.662099716s" podCreationTimestamp="2025-12-16 14:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:56.659291396 +0000 UTC m=+321.610370329" watchObservedRunningTime="2025-12-16 14:59:56.662099716 +0000 UTC m=+321.613178639" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.672861 4775 scope.go:117] "RemoveContainer" containerID="71a5256bd60c43445f1d19512f8df0dd1cce666a9a3e4a4e6f5acaff757aad4b" Dec 16 14:59:56 crc kubenswrapper[4775]: E1216 14:59:56.673441 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71a5256bd60c43445f1d19512f8df0dd1cce666a9a3e4a4e6f5acaff757aad4b\": container with ID starting with 71a5256bd60c43445f1d19512f8df0dd1cce666a9a3e4a4e6f5acaff757aad4b not found: ID does not exist" containerID="71a5256bd60c43445f1d19512f8df0dd1cce666a9a3e4a4e6f5acaff757aad4b" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.673521 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a5256bd60c43445f1d19512f8df0dd1cce666a9a3e4a4e6f5acaff757aad4b"} err="failed to get container status \"71a5256bd60c43445f1d19512f8df0dd1cce666a9a3e4a4e6f5acaff757aad4b\": rpc error: code = NotFound desc = could not find container \"71a5256bd60c43445f1d19512f8df0dd1cce666a9a3e4a4e6f5acaff757aad4b\": container with ID starting with 71a5256bd60c43445f1d19512f8df0dd1cce666a9a3e4a4e6f5acaff757aad4b not found: ID does not exist" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.691823 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"] Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.699638 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-775d89cb8f-976qk"] 
Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.726365 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-client-ca\") pod \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.726431 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4ftw\" (UniqueName: \"kubernetes.io/projected/d009283a-9bbd-455c-b5c2-4ed6c3336b52-kube-api-access-v4ftw\") pod \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.727034 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-config\") pod \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.727077 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d009283a-9bbd-455c-b5c2-4ed6c3336b52-serving-cert\") pod \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.727111 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-proxy-ca-bundles\") pod \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\" (UID: \"d009283a-9bbd-455c-b5c2-4ed6c3336b52\") " Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.727503 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-client-ca" (OuterVolumeSpecName: 
"client-ca") pod "d009283a-9bbd-455c-b5c2-4ed6c3336b52" (UID: "d009283a-9bbd-455c-b5c2-4ed6c3336b52"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.727721 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-config" (OuterVolumeSpecName: "config") pod "d009283a-9bbd-455c-b5c2-4ed6c3336b52" (UID: "d009283a-9bbd-455c-b5c2-4ed6c3336b52"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.728077 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d009283a-9bbd-455c-b5c2-4ed6c3336b52" (UID: "d009283a-9bbd-455c-b5c2-4ed6c3336b52"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.730606 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d009283a-9bbd-455c-b5c2-4ed6c3336b52-kube-api-access-v4ftw" (OuterVolumeSpecName: "kube-api-access-v4ftw") pod "d009283a-9bbd-455c-b5c2-4ed6c3336b52" (UID: "d009283a-9bbd-455c-b5c2-4ed6c3336b52"). InnerVolumeSpecName "kube-api-access-v4ftw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.730690 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d009283a-9bbd-455c-b5c2-4ed6c3336b52-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d009283a-9bbd-455c-b5c2-4ed6c3336b52" (UID: "d009283a-9bbd-455c-b5c2-4ed6c3336b52"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.828080 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.828134 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4ftw\" (UniqueName: \"kubernetes.io/projected/d009283a-9bbd-455c-b5c2-4ed6c3336b52-kube-api-access-v4ftw\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.828149 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.828160 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d009283a-9bbd-455c-b5c2-4ed6c3336b52-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.828170 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d009283a-9bbd-455c-b5c2-4ed6c3336b52-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.965412 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-767c66895f-dfn2j"] Dec 16 14:59:56 crc kubenswrapper[4775]: I1216 14:59:56.969057 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-767c66895f-dfn2j"] Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.304836 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54b779b799-tn5gf"] Dec 16 14:59:57 crc kubenswrapper[4775]: E1216 
14:59:57.305214 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61dbff96-b237-4949-99e1-774bcec682f4" containerName="route-controller-manager" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.305236 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="61dbff96-b237-4949-99e1-774bcec682f4" containerName="route-controller-manager" Dec 16 14:59:57 crc kubenswrapper[4775]: E1216 14:59:57.305257 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d009283a-9bbd-455c-b5c2-4ed6c3336b52" containerName="controller-manager" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.305267 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d009283a-9bbd-455c-b5c2-4ed6c3336b52" containerName="controller-manager" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.305408 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="61dbff96-b237-4949-99e1-774bcec682f4" containerName="route-controller-manager" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.305427 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d009283a-9bbd-455c-b5c2-4ed6c3336b52" containerName="controller-manager" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.306009 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.308248 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.308410 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.309182 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.309345 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr"] Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.309658 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.309704 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.309843 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.310224 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.314271 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.314793 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.315121 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.316342 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.316446 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.316472 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.317183 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.320538 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54b779b799-tn5gf"] Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.323804 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr"] Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.335112 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/e75c6912-dc4b-45f0-afa8-d63b604106e6-serving-cert\") pod \"controller-manager-54b779b799-tn5gf\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.335155 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-client-ca\") pod \"controller-manager-54b779b799-tn5gf\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.335179 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b14567a-6ecf-4c94-a972-0ac76733116e-config\") pod \"route-controller-manager-6485597c9c-tc4sr\" (UID: \"3b14567a-6ecf-4c94-a972-0ac76733116e\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.335208 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcljg\" (UniqueName: \"kubernetes.io/projected/e75c6912-dc4b-45f0-afa8-d63b604106e6-kube-api-access-qcljg\") pod \"controller-manager-54b779b799-tn5gf\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.335229 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-config\") pod \"controller-manager-54b779b799-tn5gf\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " 
pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.335252 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b14567a-6ecf-4c94-a972-0ac76733116e-client-ca\") pod \"route-controller-manager-6485597c9c-tc4sr\" (UID: \"3b14567a-6ecf-4c94-a972-0ac76733116e\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.335276 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z6gf\" (UniqueName: \"kubernetes.io/projected/3b14567a-6ecf-4c94-a972-0ac76733116e-kube-api-access-7z6gf\") pod \"route-controller-manager-6485597c9c-tc4sr\" (UID: \"3b14567a-6ecf-4c94-a972-0ac76733116e\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.335297 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-proxy-ca-bundles\") pod \"controller-manager-54b779b799-tn5gf\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.335314 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b14567a-6ecf-4c94-a972-0ac76733116e-serving-cert\") pod \"route-controller-manager-6485597c9c-tc4sr\" (UID: \"3b14567a-6ecf-4c94-a972-0ac76733116e\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.345693 4775 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="61dbff96-b237-4949-99e1-774bcec682f4" path="/var/lib/kubelet/pods/61dbff96-b237-4949-99e1-774bcec682f4/volumes" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.346384 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d009283a-9bbd-455c-b5c2-4ed6c3336b52" path="/var/lib/kubelet/pods/d009283a-9bbd-455c-b5c2-4ed6c3336b52/volumes" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.435972 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-config\") pod \"controller-manager-54b779b799-tn5gf\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.436028 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b14567a-6ecf-4c94-a972-0ac76733116e-client-ca\") pod \"route-controller-manager-6485597c9c-tc4sr\" (UID: \"3b14567a-6ecf-4c94-a972-0ac76733116e\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.436059 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z6gf\" (UniqueName: \"kubernetes.io/projected/3b14567a-6ecf-4c94-a972-0ac76733116e-kube-api-access-7z6gf\") pod \"route-controller-manager-6485597c9c-tc4sr\" (UID: \"3b14567a-6ecf-4c94-a972-0ac76733116e\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.436082 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-proxy-ca-bundles\") pod \"controller-manager-54b779b799-tn5gf\" (UID: 
\"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.436100 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b14567a-6ecf-4c94-a972-0ac76733116e-serving-cert\") pod \"route-controller-manager-6485597c9c-tc4sr\" (UID: \"3b14567a-6ecf-4c94-a972-0ac76733116e\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.436130 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e75c6912-dc4b-45f0-afa8-d63b604106e6-serving-cert\") pod \"controller-manager-54b779b799-tn5gf\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.436146 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-client-ca\") pod \"controller-manager-54b779b799-tn5gf\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.436172 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b14567a-6ecf-4c94-a972-0ac76733116e-config\") pod \"route-controller-manager-6485597c9c-tc4sr\" (UID: \"3b14567a-6ecf-4c94-a972-0ac76733116e\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.436213 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcljg\" (UniqueName: 
\"kubernetes.io/projected/e75c6912-dc4b-45f0-afa8-d63b604106e6-kube-api-access-qcljg\") pod \"controller-manager-54b779b799-tn5gf\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.438398 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-config\") pod \"controller-manager-54b779b799-tn5gf\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.439130 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b14567a-6ecf-4c94-a972-0ac76733116e-client-ca\") pod \"route-controller-manager-6485597c9c-tc4sr\" (UID: \"3b14567a-6ecf-4c94-a972-0ac76733116e\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.440277 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-proxy-ca-bundles\") pod \"controller-manager-54b779b799-tn5gf\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.441409 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-client-ca\") pod \"controller-manager-54b779b799-tn5gf\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.442086 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b14567a-6ecf-4c94-a972-0ac76733116e-config\") pod \"route-controller-manager-6485597c9c-tc4sr\" (UID: \"3b14567a-6ecf-4c94-a972-0ac76733116e\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.444588 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e75c6912-dc4b-45f0-afa8-d63b604106e6-serving-cert\") pod \"controller-manager-54b779b799-tn5gf\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.445087 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b14567a-6ecf-4c94-a972-0ac76733116e-serving-cert\") pod \"route-controller-manager-6485597c9c-tc4sr\" (UID: \"3b14567a-6ecf-4c94-a972-0ac76733116e\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.450661 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcljg\" (UniqueName: \"kubernetes.io/projected/e75c6912-dc4b-45f0-afa8-d63b604106e6-kube-api-access-qcljg\") pod \"controller-manager-54b779b799-tn5gf\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.454520 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z6gf\" (UniqueName: \"kubernetes.io/projected/3b14567a-6ecf-4c94-a972-0ac76733116e-kube-api-access-7z6gf\") pod \"route-controller-manager-6485597c9c-tc4sr\" (UID: \"3b14567a-6ecf-4c94-a972-0ac76733116e\") " 
pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.638711 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.646176 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 14:59:57 crc kubenswrapper[4775]: I1216 14:59:57.925823 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr"] Dec 16 14:59:57 crc kubenswrapper[4775]: W1216 14:59:57.931824 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b14567a_6ecf_4c94_a972_0ac76733116e.slice/crio-f82ea5e3b3ce20316f37020f38816057a2835e114ad3360452359bf61bb8a69c WatchSource:0}: Error finding container f82ea5e3b3ce20316f37020f38816057a2835e114ad3360452359bf61bb8a69c: Status 404 returned error can't find the container with id f82ea5e3b3ce20316f37020f38816057a2835e114ad3360452359bf61bb8a69c Dec 16 14:59:58 crc kubenswrapper[4775]: I1216 14:59:58.084357 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54b779b799-tn5gf"] Dec 16 14:59:58 crc kubenswrapper[4775]: W1216 14:59:58.088413 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode75c6912_dc4b_45f0_afa8_d63b604106e6.slice/crio-3bb8087152f67f4972ba8a632738dfdf1cc323d30cefd5f55c0ae9176c5946d5 WatchSource:0}: Error finding container 3bb8087152f67f4972ba8a632738dfdf1cc323d30cefd5f55c0ae9176c5946d5: Status 404 returned error can't find the container with id 3bb8087152f67f4972ba8a632738dfdf1cc323d30cefd5f55c0ae9176c5946d5 Dec 16 
14:59:58 crc kubenswrapper[4775]: I1216 14:59:58.659365 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" event={"ID":"e75c6912-dc4b-45f0-afa8-d63b604106e6","Type":"ContainerStarted","Data":"3bb8087152f67f4972ba8a632738dfdf1cc323d30cefd5f55c0ae9176c5946d5"} Dec 16 14:59:58 crc kubenswrapper[4775]: I1216 14:59:58.661318 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" event={"ID":"3b14567a-6ecf-4c94-a972-0ac76733116e","Type":"ContainerStarted","Data":"f82ea5e3b3ce20316f37020f38816057a2835e114ad3360452359bf61bb8a69c"} Dec 16 14:59:59 crc kubenswrapper[4775]: I1216 14:59:59.667607 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" event={"ID":"3b14567a-6ecf-4c94-a972-0ac76733116e","Type":"ContainerStarted","Data":"16fd6250eaf3040b1123085ef0b34df2a73ce6b6de76af294d1218391f61f88c"} Dec 16 14:59:59 crc kubenswrapper[4775]: I1216 14:59:59.668124 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 14:59:59 crc kubenswrapper[4775]: I1216 14:59:59.670605 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" event={"ID":"e75c6912-dc4b-45f0-afa8-d63b604106e6","Type":"ContainerStarted","Data":"a16d5e393f45e624273326ec81689460aab4deb1f5ee83113d438e29b6d74181"} Dec 16 14:59:59 crc kubenswrapper[4775]: I1216 14:59:59.670865 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:59 crc kubenswrapper[4775]: I1216 14:59:59.673578 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 14:59:59 crc kubenswrapper[4775]: I1216 14:59:59.675074 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 14:59:59 crc kubenswrapper[4775]: I1216 14:59:59.690335 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" podStartSLOduration=4.690312116 podStartE2EDuration="4.690312116s" podCreationTimestamp="2025-12-16 14:59:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:59.684762778 +0000 UTC m=+324.635841721" watchObservedRunningTime="2025-12-16 14:59:59.690312116 +0000 UTC m=+324.641391039" Dec 16 15:00:00 crc kubenswrapper[4775]: I1216 15:00:00.157951 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" podStartSLOduration=5.157928666 podStartE2EDuration="5.157928666s" podCreationTimestamp="2025-12-16 14:59:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:59.726115801 +0000 UTC m=+324.677194744" watchObservedRunningTime="2025-12-16 15:00:00.157928666 +0000 UTC m=+325.109007609" Dec 16 15:00:00 crc kubenswrapper[4775]: I1216 15:00:00.160812 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh"] Dec 16 15:00:00 crc kubenswrapper[4775]: I1216 15:00:00.161581 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh" Dec 16 15:00:00 crc kubenswrapper[4775]: I1216 15:00:00.163210 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 15:00:00 crc kubenswrapper[4775]: I1216 15:00:00.167129 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 15:00:00 crc kubenswrapper[4775]: I1216 15:00:00.175208 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh"] Dec 16 15:00:00 crc kubenswrapper[4775]: I1216 15:00:00.197467 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/664d0b17-7c71-45b4-b654-b478ba3737e8-secret-volume\") pod \"collect-profiles-29431620-jx5mh\" (UID: \"664d0b17-7c71-45b4-b654-b478ba3737e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh" Dec 16 15:00:00 crc kubenswrapper[4775]: I1216 15:00:00.197545 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/664d0b17-7c71-45b4-b654-b478ba3737e8-config-volume\") pod \"collect-profiles-29431620-jx5mh\" (UID: \"664d0b17-7c71-45b4-b654-b478ba3737e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh" Dec 16 15:00:00 crc kubenswrapper[4775]: I1216 15:00:00.197640 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vbfn\" (UniqueName: \"kubernetes.io/projected/664d0b17-7c71-45b4-b654-b478ba3737e8-kube-api-access-6vbfn\") pod \"collect-profiles-29431620-jx5mh\" (UID: \"664d0b17-7c71-45b4-b654-b478ba3737e8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh" Dec 16 15:00:00 crc kubenswrapper[4775]: I1216 15:00:00.298386 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/664d0b17-7c71-45b4-b654-b478ba3737e8-secret-volume\") pod \"collect-profiles-29431620-jx5mh\" (UID: \"664d0b17-7c71-45b4-b654-b478ba3737e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh" Dec 16 15:00:00 crc kubenswrapper[4775]: I1216 15:00:00.298452 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/664d0b17-7c71-45b4-b654-b478ba3737e8-config-volume\") pod \"collect-profiles-29431620-jx5mh\" (UID: \"664d0b17-7c71-45b4-b654-b478ba3737e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh" Dec 16 15:00:00 crc kubenswrapper[4775]: I1216 15:00:00.298535 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vbfn\" (UniqueName: \"kubernetes.io/projected/664d0b17-7c71-45b4-b654-b478ba3737e8-kube-api-access-6vbfn\") pod \"collect-profiles-29431620-jx5mh\" (UID: \"664d0b17-7c71-45b4-b654-b478ba3737e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh" Dec 16 15:00:00 crc kubenswrapper[4775]: I1216 15:00:00.299441 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/664d0b17-7c71-45b4-b654-b478ba3737e8-config-volume\") pod \"collect-profiles-29431620-jx5mh\" (UID: \"664d0b17-7c71-45b4-b654-b478ba3737e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh" Dec 16 15:00:00 crc kubenswrapper[4775]: I1216 15:00:00.304550 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/664d0b17-7c71-45b4-b654-b478ba3737e8-secret-volume\") pod \"collect-profiles-29431620-jx5mh\" (UID: \"664d0b17-7c71-45b4-b654-b478ba3737e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh" Dec 16 15:00:00 crc kubenswrapper[4775]: I1216 15:00:00.317532 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vbfn\" (UniqueName: \"kubernetes.io/projected/664d0b17-7c71-45b4-b654-b478ba3737e8-kube-api-access-6vbfn\") pod \"collect-profiles-29431620-jx5mh\" (UID: \"664d0b17-7c71-45b4-b654-b478ba3737e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh" Dec 16 15:00:00 crc kubenswrapper[4775]: I1216 15:00:00.483316 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh" Dec 16 15:00:00 crc kubenswrapper[4775]: I1216 15:00:00.883330 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh"] Dec 16 15:00:01 crc kubenswrapper[4775]: I1216 15:00:01.683037 4775 generic.go:334] "Generic (PLEG): container finished" podID="664d0b17-7c71-45b4-b654-b478ba3737e8" containerID="e2da98f295d602378ff1162446963de4a3e2fe1ce51c76a9a1f6c5c9d8b1a3a2" exitCode=0 Dec 16 15:00:01 crc kubenswrapper[4775]: I1216 15:00:01.683157 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh" event={"ID":"664d0b17-7c71-45b4-b654-b478ba3737e8","Type":"ContainerDied","Data":"e2da98f295d602378ff1162446963de4a3e2fe1ce51c76a9a1f6c5c9d8b1a3a2"} Dec 16 15:00:01 crc kubenswrapper[4775]: I1216 15:00:01.683403 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh" 
event={"ID":"664d0b17-7c71-45b4-b654-b478ba3737e8","Type":"ContainerStarted","Data":"f0454a77136f21f89fda1eaa8d9a7524848b76ab4e71c7b1803c07b8d3f68e8e"} Dec 16 15:00:03 crc kubenswrapper[4775]: I1216 15:00:03.003766 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh" Dec 16 15:00:03 crc kubenswrapper[4775]: I1216 15:00:03.039569 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vbfn\" (UniqueName: \"kubernetes.io/projected/664d0b17-7c71-45b4-b654-b478ba3737e8-kube-api-access-6vbfn\") pod \"664d0b17-7c71-45b4-b654-b478ba3737e8\" (UID: \"664d0b17-7c71-45b4-b654-b478ba3737e8\") " Dec 16 15:00:03 crc kubenswrapper[4775]: I1216 15:00:03.039644 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/664d0b17-7c71-45b4-b654-b478ba3737e8-config-volume\") pod \"664d0b17-7c71-45b4-b654-b478ba3737e8\" (UID: \"664d0b17-7c71-45b4-b654-b478ba3737e8\") " Dec 16 15:00:03 crc kubenswrapper[4775]: I1216 15:00:03.039674 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/664d0b17-7c71-45b4-b654-b478ba3737e8-secret-volume\") pod \"664d0b17-7c71-45b4-b654-b478ba3737e8\" (UID: \"664d0b17-7c71-45b4-b654-b478ba3737e8\") " Dec 16 15:00:03 crc kubenswrapper[4775]: I1216 15:00:03.041478 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/664d0b17-7c71-45b4-b654-b478ba3737e8-config-volume" (OuterVolumeSpecName: "config-volume") pod "664d0b17-7c71-45b4-b654-b478ba3737e8" (UID: "664d0b17-7c71-45b4-b654-b478ba3737e8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:03 crc kubenswrapper[4775]: I1216 15:00:03.046509 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/664d0b17-7c71-45b4-b654-b478ba3737e8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "664d0b17-7c71-45b4-b654-b478ba3737e8" (UID: "664d0b17-7c71-45b4-b654-b478ba3737e8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:00:03 crc kubenswrapper[4775]: I1216 15:00:03.047304 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/664d0b17-7c71-45b4-b654-b478ba3737e8-kube-api-access-6vbfn" (OuterVolumeSpecName: "kube-api-access-6vbfn") pod "664d0b17-7c71-45b4-b654-b478ba3737e8" (UID: "664d0b17-7c71-45b4-b654-b478ba3737e8"). InnerVolumeSpecName "kube-api-access-6vbfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:03 crc kubenswrapper[4775]: I1216 15:00:03.141171 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vbfn\" (UniqueName: \"kubernetes.io/projected/664d0b17-7c71-45b4-b654-b478ba3737e8-kube-api-access-6vbfn\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:03 crc kubenswrapper[4775]: I1216 15:00:03.141210 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/664d0b17-7c71-45b4-b654-b478ba3737e8-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:03 crc kubenswrapper[4775]: I1216 15:00:03.141219 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/664d0b17-7c71-45b4-b654-b478ba3737e8-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:03 crc kubenswrapper[4775]: I1216 15:00:03.697913 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh" 
event={"ID":"664d0b17-7c71-45b4-b654-b478ba3737e8","Type":"ContainerDied","Data":"f0454a77136f21f89fda1eaa8d9a7524848b76ab4e71c7b1803c07b8d3f68e8e"} Dec 16 15:00:03 crc kubenswrapper[4775]: I1216 15:00:03.697969 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh" Dec 16 15:00:03 crc kubenswrapper[4775]: I1216 15:00:03.698137 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0454a77136f21f89fda1eaa8d9a7524848b76ab4e71c7b1803c07b8d3f68e8e" Dec 16 15:00:04 crc kubenswrapper[4775]: I1216 15:00:04.067606 4775 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 15:00:04 crc kubenswrapper[4775]: I1216 15:00:04.068396 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://9cdda39e91cf4bead8a8025344cd0552633ad50212c2665431be084850260223" gracePeriod=5 Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.674175 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.674977 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.730225 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.730299 4775 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="9cdda39e91cf4bead8a8025344cd0552633ad50212c2665431be084850260223" exitCode=137 Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.730367 4775 scope.go:117] "RemoveContainer" containerID="9cdda39e91cf4bead8a8025344cd0552633ad50212c2665431be084850260223" Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.730398 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.744700 4775 scope.go:117] "RemoveContainer" containerID="9cdda39e91cf4bead8a8025344cd0552633ad50212c2665431be084850260223" Dec 16 15:00:09 crc kubenswrapper[4775]: E1216 15:00:09.745306 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cdda39e91cf4bead8a8025344cd0552633ad50212c2665431be084850260223\": container with ID starting with 9cdda39e91cf4bead8a8025344cd0552633ad50212c2665431be084850260223 not found: ID does not exist" containerID="9cdda39e91cf4bead8a8025344cd0552633ad50212c2665431be084850260223" Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.745378 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cdda39e91cf4bead8a8025344cd0552633ad50212c2665431be084850260223"} err="failed to get container status \"9cdda39e91cf4bead8a8025344cd0552633ad50212c2665431be084850260223\": rpc error: code = NotFound desc = could 
not find container \"9cdda39e91cf4bead8a8025344cd0552633ad50212c2665431be084850260223\": container with ID starting with 9cdda39e91cf4bead8a8025344cd0552633ad50212c2665431be084850260223 not found: ID does not exist" Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.858725 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.858834 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.858843 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.858916 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.858947 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 15:00:09 crc 
kubenswrapper[4775]: I1216 15:00:09.858970 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.859262 4775 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.859303 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.859335 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.859533 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.866500 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.960628 4775 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.960673 4775 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.960683 4775 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:09 crc kubenswrapper[4775]: I1216 15:00:09.960691 4775 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:11 crc kubenswrapper[4775]: I1216 15:00:11.344818 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 16 15:00:11 crc kubenswrapper[4775]: I1216 15:00:11.345145 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 16 15:00:11 crc kubenswrapper[4775]: 
I1216 15:00:11.354158 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 15:00:11 crc kubenswrapper[4775]: I1216 15:00:11.354212 4775 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="01083932-8614-43e7-a382-c9a18ae4b9a4" Dec 16 15:00:11 crc kubenswrapper[4775]: I1216 15:00:11.357327 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 15:00:11 crc kubenswrapper[4775]: I1216 15:00:11.357355 4775 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="01083932-8614-43e7-a382-c9a18ae4b9a4" Dec 16 15:00:11 crc kubenswrapper[4775]: I1216 15:00:11.875277 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 16 15:00:14 crc kubenswrapper[4775]: I1216 15:00:14.796130 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 16 15:00:15 crc kubenswrapper[4775]: I1216 15:00:15.713505 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54b779b799-tn5gf"] Dec 16 15:00:15 crc kubenswrapper[4775]: I1216 15:00:15.713798 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" podUID="e75c6912-dc4b-45f0-afa8-d63b604106e6" containerName="controller-manager" containerID="cri-o://a16d5e393f45e624273326ec81689460aab4deb1f5ee83113d438e29b6d74181" gracePeriod=30 Dec 16 15:00:15 crc kubenswrapper[4775]: I1216 15:00:15.722078 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr"] Dec 16 15:00:15 crc kubenswrapper[4775]: I1216 15:00:15.722364 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" podUID="3b14567a-6ecf-4c94-a972-0ac76733116e" containerName="route-controller-manager" containerID="cri-o://16fd6250eaf3040b1123085ef0b34df2a73ce6b6de76af294d1218391f61f88c" gracePeriod=30 Dec 16 15:00:15 crc kubenswrapper[4775]: I1216 15:00:15.905718 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.775084 4775 generic.go:334] "Generic (PLEG): container finished" podID="e75c6912-dc4b-45f0-afa8-d63b604106e6" containerID="a16d5e393f45e624273326ec81689460aab4deb1f5ee83113d438e29b6d74181" exitCode=0 Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.775193 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" event={"ID":"e75c6912-dc4b-45f0-afa8-d63b604106e6","Type":"ContainerDied","Data":"a16d5e393f45e624273326ec81689460aab4deb1f5ee83113d438e29b6d74181"} Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.777134 4775 generic.go:334] "Generic (PLEG): container finished" podID="3b14567a-6ecf-4c94-a972-0ac76733116e" containerID="16fd6250eaf3040b1123085ef0b34df2a73ce6b6de76af294d1218391f61f88c" exitCode=0 Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.777159 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" event={"ID":"3b14567a-6ecf-4c94-a972-0ac76733116e","Type":"ContainerDied","Data":"16fd6250eaf3040b1123085ef0b34df2a73ce6b6de76af294d1218391f61f88c"} Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.777178 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" event={"ID":"3b14567a-6ecf-4c94-a972-0ac76733116e","Type":"ContainerDied","Data":"f82ea5e3b3ce20316f37020f38816057a2835e114ad3360452359bf61bb8a69c"} Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.777189 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f82ea5e3b3ce20316f37020f38816057a2835e114ad3360452359bf61bb8a69c" Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.786948 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.874372 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr"] Dec 16 15:00:16 crc kubenswrapper[4775]: E1216 15:00:16.874643 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b14567a-6ecf-4c94-a972-0ac76733116e" containerName="route-controller-manager" Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.874659 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b14567a-6ecf-4c94-a972-0ac76733116e" containerName="route-controller-manager" Dec 16 15:00:16 crc kubenswrapper[4775]: E1216 15:00:16.874671 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.874679 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 16 15:00:16 crc kubenswrapper[4775]: E1216 15:00:16.874737 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="664d0b17-7c71-45b4-b654-b478ba3737e8" containerName="collect-profiles" Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.874748 4775 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="664d0b17-7c71-45b4-b654-b478ba3737e8" containerName="collect-profiles" Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.874974 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="664d0b17-7c71-45b4-b654-b478ba3737e8" containerName="collect-profiles" Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.874988 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.875007 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b14567a-6ecf-4c94-a972-0ac76733116e" containerName="route-controller-manager" Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.875768 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.883248 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr"] Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.920411 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.963826 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z6gf\" (UniqueName: \"kubernetes.io/projected/3b14567a-6ecf-4c94-a972-0ac76733116e-kube-api-access-7z6gf\") pod \"3b14567a-6ecf-4c94-a972-0ac76733116e\" (UID: \"3b14567a-6ecf-4c94-a972-0ac76733116e\") " Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.963973 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b14567a-6ecf-4c94-a972-0ac76733116e-config\") pod \"3b14567a-6ecf-4c94-a972-0ac76733116e\" (UID: \"3b14567a-6ecf-4c94-a972-0ac76733116e\") " Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.964031 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b14567a-6ecf-4c94-a972-0ac76733116e-client-ca\") pod \"3b14567a-6ecf-4c94-a972-0ac76733116e\" (UID: \"3b14567a-6ecf-4c94-a972-0ac76733116e\") " Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.964075 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b14567a-6ecf-4c94-a972-0ac76733116e-serving-cert\") pod \"3b14567a-6ecf-4c94-a972-0ac76733116e\" (UID: \"3b14567a-6ecf-4c94-a972-0ac76733116e\") " Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.965552 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b14567a-6ecf-4c94-a972-0ac76733116e-config" (OuterVolumeSpecName: "config") pod "3b14567a-6ecf-4c94-a972-0ac76733116e" (UID: "3b14567a-6ecf-4c94-a972-0ac76733116e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.966879 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b14567a-6ecf-4c94-a972-0ac76733116e-client-ca" (OuterVolumeSpecName: "client-ca") pod "3b14567a-6ecf-4c94-a972-0ac76733116e" (UID: "3b14567a-6ecf-4c94-a972-0ac76733116e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.972454 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b14567a-6ecf-4c94-a972-0ac76733116e-kube-api-access-7z6gf" (OuterVolumeSpecName: "kube-api-access-7z6gf") pod "3b14567a-6ecf-4c94-a972-0ac76733116e" (UID: "3b14567a-6ecf-4c94-a972-0ac76733116e"). InnerVolumeSpecName "kube-api-access-7z6gf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:16 crc kubenswrapper[4775]: I1216 15:00:16.972579 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b14567a-6ecf-4c94-a972-0ac76733116e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3b14567a-6ecf-4c94-a972-0ac76733116e" (UID: "3b14567a-6ecf-4c94-a972-0ac76733116e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.065285 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcljg\" (UniqueName: \"kubernetes.io/projected/e75c6912-dc4b-45f0-afa8-d63b604106e6-kube-api-access-qcljg\") pod \"e75c6912-dc4b-45f0-afa8-d63b604106e6\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.065379 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-proxy-ca-bundles\") pod \"e75c6912-dc4b-45f0-afa8-d63b604106e6\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.065429 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e75c6912-dc4b-45f0-afa8-d63b604106e6-serving-cert\") pod \"e75c6912-dc4b-45f0-afa8-d63b604106e6\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.065488 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-client-ca\") pod \"e75c6912-dc4b-45f0-afa8-d63b604106e6\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.065622 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-config\") pod \"e75c6912-dc4b-45f0-afa8-d63b604106e6\" (UID: \"e75c6912-dc4b-45f0-afa8-d63b604106e6\") " Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.066014 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpwpv\" 
(UniqueName: \"kubernetes.io/projected/42d2575e-21fa-4f68-b90c-8fe3e253d829-kube-api-access-vpwpv\") pod \"route-controller-manager-6d9bc4f474-sdrsr\" (UID: \"42d2575e-21fa-4f68-b90c-8fe3e253d829\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.066105 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42d2575e-21fa-4f68-b90c-8fe3e253d829-client-ca\") pod \"route-controller-manager-6d9bc4f474-sdrsr\" (UID: \"42d2575e-21fa-4f68-b90c-8fe3e253d829\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.067048 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e75c6912-dc4b-45f0-afa8-d63b604106e6" (UID: "e75c6912-dc4b-45f0-afa8-d63b604106e6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.067219 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-client-ca" (OuterVolumeSpecName: "client-ca") pod "e75c6912-dc4b-45f0-afa8-d63b604106e6" (UID: "e75c6912-dc4b-45f0-afa8-d63b604106e6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.067386 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-config" (OuterVolumeSpecName: "config") pod "e75c6912-dc4b-45f0-afa8-d63b604106e6" (UID: "e75c6912-dc4b-45f0-afa8-d63b604106e6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.067459 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42d2575e-21fa-4f68-b90c-8fe3e253d829-serving-cert\") pod \"route-controller-manager-6d9bc4f474-sdrsr\" (UID: \"42d2575e-21fa-4f68-b90c-8fe3e253d829\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.067524 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42d2575e-21fa-4f68-b90c-8fe3e253d829-config\") pod \"route-controller-manager-6d9bc4f474-sdrsr\" (UID: \"42d2575e-21fa-4f68-b90c-8fe3e253d829\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.067606 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b14567a-6ecf-4c94-a972-0ac76733116e-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.067626 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.067645 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.067664 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b14567a-6ecf-4c94-a972-0ac76733116e-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 
15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.067680 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b14567a-6ecf-4c94-a972-0ac76733116e-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.067696 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e75c6912-dc4b-45f0-afa8-d63b604106e6-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.067714 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z6gf\" (UniqueName: \"kubernetes.io/projected/3b14567a-6ecf-4c94-a972-0ac76733116e-kube-api-access-7z6gf\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.070510 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e75c6912-dc4b-45f0-afa8-d63b604106e6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e75c6912-dc4b-45f0-afa8-d63b604106e6" (UID: "e75c6912-dc4b-45f0-afa8-d63b604106e6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.071197 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e75c6912-dc4b-45f0-afa8-d63b604106e6-kube-api-access-qcljg" (OuterVolumeSpecName: "kube-api-access-qcljg") pod "e75c6912-dc4b-45f0-afa8-d63b604106e6" (UID: "e75c6912-dc4b-45f0-afa8-d63b604106e6"). InnerVolumeSpecName "kube-api-access-qcljg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.127178 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.169057 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpwpv\" (UniqueName: \"kubernetes.io/projected/42d2575e-21fa-4f68-b90c-8fe3e253d829-kube-api-access-vpwpv\") pod \"route-controller-manager-6d9bc4f474-sdrsr\" (UID: \"42d2575e-21fa-4f68-b90c-8fe3e253d829\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.169166 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42d2575e-21fa-4f68-b90c-8fe3e253d829-client-ca\") pod \"route-controller-manager-6d9bc4f474-sdrsr\" (UID: \"42d2575e-21fa-4f68-b90c-8fe3e253d829\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.169222 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42d2575e-21fa-4f68-b90c-8fe3e253d829-serving-cert\") pod \"route-controller-manager-6d9bc4f474-sdrsr\" (UID: \"42d2575e-21fa-4f68-b90c-8fe3e253d829\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.169283 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42d2575e-21fa-4f68-b90c-8fe3e253d829-config\") pod \"route-controller-manager-6d9bc4f474-sdrsr\" (UID: \"42d2575e-21fa-4f68-b90c-8fe3e253d829\") " 
pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.169401 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcljg\" (UniqueName: \"kubernetes.io/projected/e75c6912-dc4b-45f0-afa8-d63b604106e6-kube-api-access-qcljg\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.169425 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e75c6912-dc4b-45f0-afa8-d63b604106e6-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.171269 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42d2575e-21fa-4f68-b90c-8fe3e253d829-client-ca\") pod \"route-controller-manager-6d9bc4f474-sdrsr\" (UID: \"42d2575e-21fa-4f68-b90c-8fe3e253d829\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.171822 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42d2575e-21fa-4f68-b90c-8fe3e253d829-config\") pod \"route-controller-manager-6d9bc4f474-sdrsr\" (UID: \"42d2575e-21fa-4f68-b90c-8fe3e253d829\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.178292 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42d2575e-21fa-4f68-b90c-8fe3e253d829-serving-cert\") pod \"route-controller-manager-6d9bc4f474-sdrsr\" (UID: \"42d2575e-21fa-4f68-b90c-8fe3e253d829\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.201487 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vpwpv\" (UniqueName: \"kubernetes.io/projected/42d2575e-21fa-4f68-b90c-8fe3e253d829-kube-api-access-vpwpv\") pod \"route-controller-manager-6d9bc4f474-sdrsr\" (UID: \"42d2575e-21fa-4f68-b90c-8fe3e253d829\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.240308 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.703490 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr"] Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.787340 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" event={"ID":"e75c6912-dc4b-45f0-afa8-d63b604106e6","Type":"ContainerDied","Data":"3bb8087152f67f4972ba8a632738dfdf1cc323d30cefd5f55c0ae9176c5946d5"} Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.787422 4775 scope.go:117] "RemoveContainer" containerID="a16d5e393f45e624273326ec81689460aab4deb1f5ee83113d438e29b6d74181" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.787580 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54b779b799-tn5gf" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.793020 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr" Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.793535 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" event={"ID":"42d2575e-21fa-4f68-b90c-8fe3e253d829","Type":"ContainerStarted","Data":"ca394adae0b9c5ffad813b74610edd254be3a2e0b5097555d33f1035ec51dcea"} Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.827679 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54b779b799-tn5gf"] Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.831711 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-54b779b799-tn5gf"] Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.834956 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr"] Dec 16 15:00:17 crc kubenswrapper[4775]: I1216 15:00:17.837678 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6485597c9c-tc4sr"] Dec 16 15:00:18 crc kubenswrapper[4775]: I1216 15:00:18.263985 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 16 15:00:18 crc kubenswrapper[4775]: I1216 15:00:18.448183 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 16 15:00:18 crc kubenswrapper[4775]: I1216 15:00:18.802033 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" event={"ID":"42d2575e-21fa-4f68-b90c-8fe3e253d829","Type":"ContainerStarted","Data":"50f891315579003e26a8dca52e44c508c86880d592ddd1262d07b6a92851d6ae"} Dec 16 15:00:18 crc 
kubenswrapper[4775]: I1216 15:00:18.802559 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:18 crc kubenswrapper[4775]: I1216 15:00:18.810269 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:18 crc kubenswrapper[4775]: I1216 15:00:18.821880 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" podStartSLOduration=3.8218598679999998 podStartE2EDuration="3.821859868s" podCreationTimestamp="2025-12-16 15:00:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:00:18.819656628 +0000 UTC m=+343.770735581" watchObservedRunningTime="2025-12-16 15:00:18.821859868 +0000 UTC m=+343.772938791" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.320878 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7cc945dc9b-p5snx"] Dec 16 15:00:19 crc kubenswrapper[4775]: E1216 15:00:19.321963 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e75c6912-dc4b-45f0-afa8-d63b604106e6" containerName="controller-manager" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.321980 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e75c6912-dc4b-45f0-afa8-d63b604106e6" containerName="controller-manager" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.322088 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e75c6912-dc4b-45f0-afa8-d63b604106e6" containerName="controller-manager" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.322525 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.324811 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.325724 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.325849 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.325930 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.325932 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.325998 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.342192 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.347406 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b14567a-6ecf-4c94-a972-0ac76733116e" path="/var/lib/kubelet/pods/3b14567a-6ecf-4c94-a972-0ac76733116e/volumes" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.348077 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e75c6912-dc4b-45f0-afa8-d63b604106e6" path="/var/lib/kubelet/pods/e75c6912-dc4b-45f0-afa8-d63b604106e6/volumes" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.348777 4775 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-controller-manager/controller-manager-7cc945dc9b-p5snx"] Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.505192 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/435c7264-633c-4a27-ad85-777767cbb1c3-serving-cert\") pod \"controller-manager-7cc945dc9b-p5snx\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.505526 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-config\") pod \"controller-manager-7cc945dc9b-p5snx\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.505563 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-proxy-ca-bundles\") pod \"controller-manager-7cc945dc9b-p5snx\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.505588 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htxjq\" (UniqueName: \"kubernetes.io/projected/435c7264-633c-4a27-ad85-777767cbb1c3-kube-api-access-htxjq\") pod \"controller-manager-7cc945dc9b-p5snx\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.505608 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-client-ca\") pod \"controller-manager-7cc945dc9b-p5snx\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.607640 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/435c7264-633c-4a27-ad85-777767cbb1c3-serving-cert\") pod \"controller-manager-7cc945dc9b-p5snx\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.607712 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-config\") pod \"controller-manager-7cc945dc9b-p5snx\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.607785 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-proxy-ca-bundles\") pod \"controller-manager-7cc945dc9b-p5snx\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.607871 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htxjq\" (UniqueName: \"kubernetes.io/projected/435c7264-633c-4a27-ad85-777767cbb1c3-kube-api-access-htxjq\") pod \"controller-manager-7cc945dc9b-p5snx\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:19 crc 
kubenswrapper[4775]: I1216 15:00:19.607967 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-client-ca\") pod \"controller-manager-7cc945dc9b-p5snx\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.609447 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-proxy-ca-bundles\") pod \"controller-manager-7cc945dc9b-p5snx\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.610125 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-client-ca\") pod \"controller-manager-7cc945dc9b-p5snx\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.610742 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-config\") pod \"controller-manager-7cc945dc9b-p5snx\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.621094 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/435c7264-633c-4a27-ad85-777767cbb1c3-serving-cert\") pod \"controller-manager-7cc945dc9b-p5snx\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " 
pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.639314 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htxjq\" (UniqueName: \"kubernetes.io/projected/435c7264-633c-4a27-ad85-777767cbb1c3-kube-api-access-htxjq\") pod \"controller-manager-7cc945dc9b-p5snx\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:19 crc kubenswrapper[4775]: I1216 15:00:19.939524 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:20 crc kubenswrapper[4775]: I1216 15:00:20.395800 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cc945dc9b-p5snx"] Dec 16 15:00:20 crc kubenswrapper[4775]: I1216 15:00:20.815983 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" event={"ID":"435c7264-633c-4a27-ad85-777767cbb1c3","Type":"ContainerStarted","Data":"c0fe81446d8631b8ce09f36d1c7e480089c8985206cedcaf7498acd39609b195"} Dec 16 15:00:20 crc kubenswrapper[4775]: I1216 15:00:20.816062 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" event={"ID":"435c7264-633c-4a27-ad85-777767cbb1c3","Type":"ContainerStarted","Data":"0e5efe676c6298a266e2fedd052352481d1a7d0ea1a89d675d482f09375c15cb"} Dec 16 15:00:20 crc kubenswrapper[4775]: I1216 15:00:20.836095 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" podStartSLOduration=5.8360758520000005 podStartE2EDuration="5.836075852s" podCreationTimestamp="2025-12-16 15:00:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:00:20.833915562 +0000 UTC m=+345.784994495" watchObservedRunningTime="2025-12-16 15:00:20.836075852 +0000 UTC m=+345.787154775" Dec 16 15:00:21 crc kubenswrapper[4775]: I1216 15:00:21.822416 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:21 crc kubenswrapper[4775]: I1216 15:00:21.826732 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:00:32 crc kubenswrapper[4775]: I1216 15:00:32.869637 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:00:32 crc kubenswrapper[4775]: I1216 15:00:32.870397 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:00:55 crc kubenswrapper[4775]: I1216 15:00:55.695946 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr"] Dec 16 15:00:55 crc kubenswrapper[4775]: I1216 15:00:55.696776 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" podUID="42d2575e-21fa-4f68-b90c-8fe3e253d829" containerName="route-controller-manager" containerID="cri-o://50f891315579003e26a8dca52e44c508c86880d592ddd1262d07b6a92851d6ae" gracePeriod=30 Dec 16 15:00:56 crc kubenswrapper[4775]: I1216 
15:00:56.034489 4775 generic.go:334] "Generic (PLEG): container finished" podID="42d2575e-21fa-4f68-b90c-8fe3e253d829" containerID="50f891315579003e26a8dca52e44c508c86880d592ddd1262d07b6a92851d6ae" exitCode=0 Dec 16 15:00:56 crc kubenswrapper[4775]: I1216 15:00:56.034542 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" event={"ID":"42d2575e-21fa-4f68-b90c-8fe3e253d829","Type":"ContainerDied","Data":"50f891315579003e26a8dca52e44c508c86880d592ddd1262d07b6a92851d6ae"} Dec 16 15:00:56 crc kubenswrapper[4775]: I1216 15:00:56.685860 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:56 crc kubenswrapper[4775]: I1216 15:00:56.809000 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42d2575e-21fa-4f68-b90c-8fe3e253d829-client-ca\") pod \"42d2575e-21fa-4f68-b90c-8fe3e253d829\" (UID: \"42d2575e-21fa-4f68-b90c-8fe3e253d829\") " Dec 16 15:00:56 crc kubenswrapper[4775]: I1216 15:00:56.809119 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpwpv\" (UniqueName: \"kubernetes.io/projected/42d2575e-21fa-4f68-b90c-8fe3e253d829-kube-api-access-vpwpv\") pod \"42d2575e-21fa-4f68-b90c-8fe3e253d829\" (UID: \"42d2575e-21fa-4f68-b90c-8fe3e253d829\") " Dec 16 15:00:56 crc kubenswrapper[4775]: I1216 15:00:56.809198 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42d2575e-21fa-4f68-b90c-8fe3e253d829-serving-cert\") pod \"42d2575e-21fa-4f68-b90c-8fe3e253d829\" (UID: \"42d2575e-21fa-4f68-b90c-8fe3e253d829\") " Dec 16 15:00:56 crc kubenswrapper[4775]: I1216 15:00:56.809386 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/42d2575e-21fa-4f68-b90c-8fe3e253d829-config\") pod \"42d2575e-21fa-4f68-b90c-8fe3e253d829\" (UID: \"42d2575e-21fa-4f68-b90c-8fe3e253d829\") " Dec 16 15:00:56 crc kubenswrapper[4775]: I1216 15:00:56.810444 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42d2575e-21fa-4f68-b90c-8fe3e253d829-client-ca" (OuterVolumeSpecName: "client-ca") pod "42d2575e-21fa-4f68-b90c-8fe3e253d829" (UID: "42d2575e-21fa-4f68-b90c-8fe3e253d829"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:56 crc kubenswrapper[4775]: I1216 15:00:56.811315 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42d2575e-21fa-4f68-b90c-8fe3e253d829-config" (OuterVolumeSpecName: "config") pod "42d2575e-21fa-4f68-b90c-8fe3e253d829" (UID: "42d2575e-21fa-4f68-b90c-8fe3e253d829"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:56 crc kubenswrapper[4775]: I1216 15:00:56.816018 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d2575e-21fa-4f68-b90c-8fe3e253d829-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "42d2575e-21fa-4f68-b90c-8fe3e253d829" (UID: "42d2575e-21fa-4f68-b90c-8fe3e253d829"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:00:56 crc kubenswrapper[4775]: I1216 15:00:56.816768 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42d2575e-21fa-4f68-b90c-8fe3e253d829-kube-api-access-vpwpv" (OuterVolumeSpecName: "kube-api-access-vpwpv") pod "42d2575e-21fa-4f68-b90c-8fe3e253d829" (UID: "42d2575e-21fa-4f68-b90c-8fe3e253d829"). InnerVolumeSpecName "kube-api-access-vpwpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:56 crc kubenswrapper[4775]: I1216 15:00:56.911053 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42d2575e-21fa-4f68-b90c-8fe3e253d829-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:56 crc kubenswrapper[4775]: I1216 15:00:56.911110 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpwpv\" (UniqueName: \"kubernetes.io/projected/42d2575e-21fa-4f68-b90c-8fe3e253d829-kube-api-access-vpwpv\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:56 crc kubenswrapper[4775]: I1216 15:00:56.911134 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42d2575e-21fa-4f68-b90c-8fe3e253d829-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:56 crc kubenswrapper[4775]: I1216 15:00:56.911152 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42d2575e-21fa-4f68-b90c-8fe3e253d829-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.040683 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" event={"ID":"42d2575e-21fa-4f68-b90c-8fe3e253d829","Type":"ContainerDied","Data":"ca394adae0b9c5ffad813b74610edd254be3a2e0b5097555d33f1035ec51dcea"} Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.040739 4775 scope.go:117] "RemoveContainer" containerID="50f891315579003e26a8dca52e44c508c86880d592ddd1262d07b6a92851d6ae" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.040867 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.070158 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr"] Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.073181 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9bc4f474-sdrsr"] Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.354730 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42d2575e-21fa-4f68-b90c-8fe3e253d829" path="/var/lib/kubelet/pods/42d2575e-21fa-4f68-b90c-8fe3e253d829/volumes" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.355270 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd"] Dec 16 15:00:57 crc kubenswrapper[4775]: E1216 15:00:57.355490 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d2575e-21fa-4f68-b90c-8fe3e253d829" containerName="route-controller-manager" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.355504 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d2575e-21fa-4f68-b90c-8fe3e253d829" containerName="route-controller-manager" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.355635 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d2575e-21fa-4f68-b90c-8fe3e253d829" containerName="route-controller-manager" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.356089 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.358862 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.358953 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd"] Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.360359 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.360589 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.361084 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.361307 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.361461 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.418063 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80e6ce68-676c-4732-83cf-07f241f63816-client-ca\") pod \"route-controller-manager-6485597c9c-fhwsd\" (UID: \"80e6ce68-676c-4732-83cf-07f241f63816\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.418113 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmckl\" (UniqueName: \"kubernetes.io/projected/80e6ce68-676c-4732-83cf-07f241f63816-kube-api-access-pmckl\") pod \"route-controller-manager-6485597c9c-fhwsd\" (UID: \"80e6ce68-676c-4732-83cf-07f241f63816\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.418154 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80e6ce68-676c-4732-83cf-07f241f63816-config\") pod \"route-controller-manager-6485597c9c-fhwsd\" (UID: \"80e6ce68-676c-4732-83cf-07f241f63816\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.418197 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80e6ce68-676c-4732-83cf-07f241f63816-serving-cert\") pod \"route-controller-manager-6485597c9c-fhwsd\" (UID: \"80e6ce68-676c-4732-83cf-07f241f63816\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.519418 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80e6ce68-676c-4732-83cf-07f241f63816-serving-cert\") pod \"route-controller-manager-6485597c9c-fhwsd\" (UID: \"80e6ce68-676c-4732-83cf-07f241f63816\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.519597 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80e6ce68-676c-4732-83cf-07f241f63816-client-ca\") pod 
\"route-controller-manager-6485597c9c-fhwsd\" (UID: \"80e6ce68-676c-4732-83cf-07f241f63816\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.519674 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmckl\" (UniqueName: \"kubernetes.io/projected/80e6ce68-676c-4732-83cf-07f241f63816-kube-api-access-pmckl\") pod \"route-controller-manager-6485597c9c-fhwsd\" (UID: \"80e6ce68-676c-4732-83cf-07f241f63816\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.519801 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80e6ce68-676c-4732-83cf-07f241f63816-config\") pod \"route-controller-manager-6485597c9c-fhwsd\" (UID: \"80e6ce68-676c-4732-83cf-07f241f63816\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.521801 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80e6ce68-676c-4732-83cf-07f241f63816-client-ca\") pod \"route-controller-manager-6485597c9c-fhwsd\" (UID: \"80e6ce68-676c-4732-83cf-07f241f63816\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.522332 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80e6ce68-676c-4732-83cf-07f241f63816-config\") pod \"route-controller-manager-6485597c9c-fhwsd\" (UID: \"80e6ce68-676c-4732-83cf-07f241f63816\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.525937 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80e6ce68-676c-4732-83cf-07f241f63816-serving-cert\") pod \"route-controller-manager-6485597c9c-fhwsd\" (UID: \"80e6ce68-676c-4732-83cf-07f241f63816\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.549494 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmckl\" (UniqueName: \"kubernetes.io/projected/80e6ce68-676c-4732-83cf-07f241f63816-kube-api-access-pmckl\") pod \"route-controller-manager-6485597c9c-fhwsd\" (UID: \"80e6ce68-676c-4732-83cf-07f241f63816\") " pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.681588 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" Dec 16 15:00:57 crc kubenswrapper[4775]: I1216 15:00:57.926718 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd"] Dec 16 15:00:58 crc kubenswrapper[4775]: I1216 15:00:58.051501 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" event={"ID":"80e6ce68-676c-4732-83cf-07f241f63816","Type":"ContainerStarted","Data":"3cad9f5ead05bfab94f6a1323232e5357609f1d550ebde6281facdd9b39c1280"} Dec 16 15:00:59 crc kubenswrapper[4775]: I1216 15:00:59.058911 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" event={"ID":"80e6ce68-676c-4732-83cf-07f241f63816","Type":"ContainerStarted","Data":"50d3fee85ca6f0a2674bb05efc998b6a946e14a9a8d4309a644c9dbd437a883c"} Dec 16 15:00:59 crc kubenswrapper[4775]: I1216 15:00:59.059297 4775 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" Dec 16 15:00:59 crc kubenswrapper[4775]: I1216 15:00:59.067334 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" Dec 16 15:00:59 crc kubenswrapper[4775]: I1216 15:00:59.096173 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6485597c9c-fhwsd" podStartSLOduration=4.09613729 podStartE2EDuration="4.09613729s" podCreationTimestamp="2025-12-16 15:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:00:59.077844947 +0000 UTC m=+384.028923900" watchObservedRunningTime="2025-12-16 15:00:59.09613729 +0000 UTC m=+384.047216233" Dec 16 15:01:02 crc kubenswrapper[4775]: I1216 15:01:02.869380 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:01:02 crc kubenswrapper[4775]: I1216 15:01:02.869752 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:01:15 crc kubenswrapper[4775]: I1216 15:01:15.687834 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cc945dc9b-p5snx"] Dec 16 15:01:15 crc kubenswrapper[4775]: I1216 15:01:15.688557 4775 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" podUID="435c7264-633c-4a27-ad85-777767cbb1c3" containerName="controller-manager" containerID="cri-o://c0fe81446d8631b8ce09f36d1c7e480089c8985206cedcaf7498acd39609b195" gracePeriod=30 Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.153156 4775 generic.go:334] "Generic (PLEG): container finished" podID="435c7264-633c-4a27-ad85-777767cbb1c3" containerID="c0fe81446d8631b8ce09f36d1c7e480089c8985206cedcaf7498acd39609b195" exitCode=0 Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.153294 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" event={"ID":"435c7264-633c-4a27-ad85-777767cbb1c3","Type":"ContainerDied","Data":"c0fe81446d8631b8ce09f36d1c7e480089c8985206cedcaf7498acd39609b195"} Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.603776 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.745778 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-proxy-ca-bundles\") pod \"435c7264-633c-4a27-ad85-777767cbb1c3\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.745877 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htxjq\" (UniqueName: \"kubernetes.io/projected/435c7264-633c-4a27-ad85-777767cbb1c3-kube-api-access-htxjq\") pod \"435c7264-633c-4a27-ad85-777767cbb1c3\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.745989 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-client-ca\") pod \"435c7264-633c-4a27-ad85-777767cbb1c3\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.746028 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-config\") pod \"435c7264-633c-4a27-ad85-777767cbb1c3\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.746053 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/435c7264-633c-4a27-ad85-777767cbb1c3-serving-cert\") pod \"435c7264-633c-4a27-ad85-777767cbb1c3\" (UID: \"435c7264-633c-4a27-ad85-777767cbb1c3\") " Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.746588 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "435c7264-633c-4a27-ad85-777767cbb1c3" (UID: "435c7264-633c-4a27-ad85-777767cbb1c3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.746609 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-client-ca" (OuterVolumeSpecName: "client-ca") pod "435c7264-633c-4a27-ad85-777767cbb1c3" (UID: "435c7264-633c-4a27-ad85-777767cbb1c3"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.746722 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-config" (OuterVolumeSpecName: "config") pod "435c7264-633c-4a27-ad85-777767cbb1c3" (UID: "435c7264-633c-4a27-ad85-777767cbb1c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.751285 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/435c7264-633c-4a27-ad85-777767cbb1c3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "435c7264-633c-4a27-ad85-777767cbb1c3" (UID: "435c7264-633c-4a27-ad85-777767cbb1c3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.751757 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/435c7264-633c-4a27-ad85-777767cbb1c3-kube-api-access-htxjq" (OuterVolumeSpecName: "kube-api-access-htxjq") pod "435c7264-633c-4a27-ad85-777767cbb1c3" (UID: "435c7264-633c-4a27-ad85-777767cbb1c3"). InnerVolumeSpecName "kube-api-access-htxjq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.847988 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htxjq\" (UniqueName: \"kubernetes.io/projected/435c7264-633c-4a27-ad85-777767cbb1c3-kube-api-access-htxjq\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.848039 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.848054 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.848068 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/435c7264-633c-4a27-ad85-777767cbb1c3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:16 crc kubenswrapper[4775]: I1216 15:01:16.848080 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/435c7264-633c-4a27-ad85-777767cbb1c3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.163347 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" event={"ID":"435c7264-633c-4a27-ad85-777767cbb1c3","Type":"ContainerDied","Data":"0e5efe676c6298a266e2fedd052352481d1a7d0ea1a89d675d482f09375c15cb"} Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.163434 4775 scope.go:117] "RemoveContainer" containerID="c0fe81446d8631b8ce09f36d1c7e480089c8985206cedcaf7498acd39609b195" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.163474 4775 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc945dc9b-p5snx" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.224175 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cc945dc9b-p5snx"] Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.242262 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7cc945dc9b-p5snx"] Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.350768 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="435c7264-633c-4a27-ad85-777767cbb1c3" path="/var/lib/kubelet/pods/435c7264-633c-4a27-ad85-777767cbb1c3/volumes" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.359784 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54b779b799-c8c65"] Dec 16 15:01:17 crc kubenswrapper[4775]: E1216 15:01:17.360195 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="435c7264-633c-4a27-ad85-777767cbb1c3" containerName="controller-manager" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.360233 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="435c7264-633c-4a27-ad85-777767cbb1c3" containerName="controller-manager" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.360400 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="435c7264-633c-4a27-ad85-777767cbb1c3" containerName="controller-manager" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.361094 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.363906 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.365138 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.368307 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.368555 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.368718 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.368941 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.369215 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54b779b799-c8c65"] Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.377510 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.456724 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52eea6c9-0475-413a-bce7-e3d15f5e256e-proxy-ca-bundles\") pod \"controller-manager-54b779b799-c8c65\" (UID: \"52eea6c9-0475-413a-bce7-e3d15f5e256e\") " 
pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.456952 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52eea6c9-0475-413a-bce7-e3d15f5e256e-config\") pod \"controller-manager-54b779b799-c8c65\" (UID: \"52eea6c9-0475-413a-bce7-e3d15f5e256e\") " pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.456995 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rl24\" (UniqueName: \"kubernetes.io/projected/52eea6c9-0475-413a-bce7-e3d15f5e256e-kube-api-access-5rl24\") pod \"controller-manager-54b779b799-c8c65\" (UID: \"52eea6c9-0475-413a-bce7-e3d15f5e256e\") " pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.457145 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52eea6c9-0475-413a-bce7-e3d15f5e256e-client-ca\") pod \"controller-manager-54b779b799-c8c65\" (UID: \"52eea6c9-0475-413a-bce7-e3d15f5e256e\") " pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.458160 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52eea6c9-0475-413a-bce7-e3d15f5e256e-serving-cert\") pod \"controller-manager-54b779b799-c8c65\" (UID: \"52eea6c9-0475-413a-bce7-e3d15f5e256e\") " pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.559579 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/52eea6c9-0475-413a-bce7-e3d15f5e256e-serving-cert\") pod \"controller-manager-54b779b799-c8c65\" (UID: \"52eea6c9-0475-413a-bce7-e3d15f5e256e\") " pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.559666 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52eea6c9-0475-413a-bce7-e3d15f5e256e-proxy-ca-bundles\") pod \"controller-manager-54b779b799-c8c65\" (UID: \"52eea6c9-0475-413a-bce7-e3d15f5e256e\") " pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.559756 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52eea6c9-0475-413a-bce7-e3d15f5e256e-config\") pod \"controller-manager-54b779b799-c8c65\" (UID: \"52eea6c9-0475-413a-bce7-e3d15f5e256e\") " pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.559800 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rl24\" (UniqueName: \"kubernetes.io/projected/52eea6c9-0475-413a-bce7-e3d15f5e256e-kube-api-access-5rl24\") pod \"controller-manager-54b779b799-c8c65\" (UID: \"52eea6c9-0475-413a-bce7-e3d15f5e256e\") " pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.559874 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52eea6c9-0475-413a-bce7-e3d15f5e256e-client-ca\") pod \"controller-manager-54b779b799-c8c65\" (UID: \"52eea6c9-0475-413a-bce7-e3d15f5e256e\") " pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.561267 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52eea6c9-0475-413a-bce7-e3d15f5e256e-proxy-ca-bundles\") pod \"controller-manager-54b779b799-c8c65\" (UID: \"52eea6c9-0475-413a-bce7-e3d15f5e256e\") " pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.562414 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52eea6c9-0475-413a-bce7-e3d15f5e256e-config\") pod \"controller-manager-54b779b799-c8c65\" (UID: \"52eea6c9-0475-413a-bce7-e3d15f5e256e\") " pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.564738 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52eea6c9-0475-413a-bce7-e3d15f5e256e-client-ca\") pod \"controller-manager-54b779b799-c8c65\" (UID: \"52eea6c9-0475-413a-bce7-e3d15f5e256e\") " pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.565639 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52eea6c9-0475-413a-bce7-e3d15f5e256e-serving-cert\") pod \"controller-manager-54b779b799-c8c65\" (UID: \"52eea6c9-0475-413a-bce7-e3d15f5e256e\") " pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.588156 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rl24\" (UniqueName: \"kubernetes.io/projected/52eea6c9-0475-413a-bce7-e3d15f5e256e-kube-api-access-5rl24\") pod \"controller-manager-54b779b799-c8c65\" (UID: \"52eea6c9-0475-413a-bce7-e3d15f5e256e\") " pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 
15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.681424 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:17 crc kubenswrapper[4775]: I1216 15:01:17.957060 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54b779b799-c8c65"] Dec 16 15:01:18 crc kubenswrapper[4775]: I1216 15:01:18.170270 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" event={"ID":"52eea6c9-0475-413a-bce7-e3d15f5e256e","Type":"ContainerStarted","Data":"e24626be556b8cff5db0382073d4b970af5f0cb4a399193d88281354ed96d852"} Dec 16 15:01:18 crc kubenswrapper[4775]: I1216 15:01:18.170635 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" event={"ID":"52eea6c9-0475-413a-bce7-e3d15f5e256e","Type":"ContainerStarted","Data":"98fe41000d354ddce7b39f29452c0968176a7ea48388e9499147f8c7ba818f3e"} Dec 16 15:01:18 crc kubenswrapper[4775]: I1216 15:01:18.171294 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:18 crc kubenswrapper[4775]: I1216 15:01:18.172868 4775 patch_prober.go:28] interesting pod/controller-manager-54b779b799-c8c65 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" start-of-body= Dec 16 15:01:18 crc kubenswrapper[4775]: I1216 15:01:18.172932 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" podUID="52eea6c9-0475-413a-bce7-e3d15f5e256e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 
10.217.0.68:8443: connect: connection refused" Dec 16 15:01:18 crc kubenswrapper[4775]: I1216 15:01:18.807488 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" podStartSLOduration=3.807464887 podStartE2EDuration="3.807464887s" podCreationTimestamp="2025-12-16 15:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:01:18.19454816 +0000 UTC m=+403.145627083" watchObservedRunningTime="2025-12-16 15:01:18.807464887 +0000 UTC m=+403.758543810" Dec 16 15:01:18 crc kubenswrapper[4775]: I1216 15:01:18.810465 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7r67s"] Dec 16 15:01:18 crc kubenswrapper[4775]: I1216 15:01:18.811346 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:18 crc kubenswrapper[4775]: I1216 15:01:18.834499 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7r67s"] Dec 16 15:01:18 crc kubenswrapper[4775]: I1216 15:01:18.978765 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzc8c\" (UniqueName: \"kubernetes.io/projected/77db571b-7b1f-42c0-aed2-46391a9ce190-kube-api-access-wzc8c\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:18 crc kubenswrapper[4775]: I1216 15:01:18.980105 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77db571b-7b1f-42c0-aed2-46391a9ce190-bound-sa-token\") pod \"image-registry-66df7c8f76-7r67s\" (UID: 
\"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:18 crc kubenswrapper[4775]: I1216 15:01:18.980201 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/77db571b-7b1f-42c0-aed2-46391a9ce190-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:18 crc kubenswrapper[4775]: I1216 15:01:18.980295 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/77db571b-7b1f-42c0-aed2-46391a9ce190-registry-certificates\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:18 crc kubenswrapper[4775]: I1216 15:01:18.980413 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77db571b-7b1f-42c0-aed2-46391a9ce190-trusted-ca\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:18 crc kubenswrapper[4775]: I1216 15:01:18.980504 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/77db571b-7b1f-42c0-aed2-46391a9ce190-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:18 crc kubenswrapper[4775]: I1216 15:01:18.980628 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:18 crc kubenswrapper[4775]: I1216 15:01:18.980672 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77db571b-7b1f-42c0-aed2-46391a9ce190-registry-tls\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.012166 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.081727 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/77db571b-7b1f-42c0-aed2-46391a9ce190-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.081793 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77db571b-7b1f-42c0-aed2-46391a9ce190-registry-tls\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.081841 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzc8c\" (UniqueName: \"kubernetes.io/projected/77db571b-7b1f-42c0-aed2-46391a9ce190-kube-api-access-wzc8c\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.081858 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77db571b-7b1f-42c0-aed2-46391a9ce190-bound-sa-token\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.081899 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/77db571b-7b1f-42c0-aed2-46391a9ce190-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.081917 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/77db571b-7b1f-42c0-aed2-46391a9ce190-registry-certificates\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.081951 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77db571b-7b1f-42c0-aed2-46391a9ce190-trusted-ca\") pod 
\"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.083338 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77db571b-7b1f-42c0-aed2-46391a9ce190-trusted-ca\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.083610 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/77db571b-7b1f-42c0-aed2-46391a9ce190-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.084167 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/77db571b-7b1f-42c0-aed2-46391a9ce190-registry-certificates\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.096866 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/77db571b-7b1f-42c0-aed2-46391a9ce190-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.098059 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/77db571b-7b1f-42c0-aed2-46391a9ce190-registry-tls\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.110509 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77db571b-7b1f-42c0-aed2-46391a9ce190-bound-sa-token\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.112025 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzc8c\" (UniqueName: \"kubernetes.io/projected/77db571b-7b1f-42c0-aed2-46391a9ce190-kube-api-access-wzc8c\") pod \"image-registry-66df7c8f76-7r67s\" (UID: \"77db571b-7b1f-42c0-aed2-46391a9ce190\") " pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.149652 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.178797 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54b779b799-c8c65" Dec 16 15:01:19 crc kubenswrapper[4775]: I1216 15:01:19.648650 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7r67s"] Dec 16 15:01:20 crc kubenswrapper[4775]: I1216 15:01:20.181414 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" event={"ID":"77db571b-7b1f-42c0-aed2-46391a9ce190","Type":"ContainerStarted","Data":"247fe8ea35a3d98b550ae28a0797eab87c6bf03db37511c82c045f732f3b03e3"} Dec 16 15:01:20 crc kubenswrapper[4775]: I1216 15:01:20.182057 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" event={"ID":"77db571b-7b1f-42c0-aed2-46391a9ce190","Type":"ContainerStarted","Data":"3fb3e0124b2c202524ba8f57316b23345aa71636b84290fdae24adb09487891a"} Dec 16 15:01:20 crc kubenswrapper[4775]: I1216 15:01:20.200773 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" podStartSLOduration=2.200752606 podStartE2EDuration="2.200752606s" podCreationTimestamp="2025-12-16 15:01:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:01:20.198476323 +0000 UTC m=+405.149555266" watchObservedRunningTime="2025-12-16 15:01:20.200752606 +0000 UTC m=+405.151831549" Dec 16 15:01:21 crc kubenswrapper[4775]: I1216 15:01:21.187934 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.232665 4775 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dh2bb"] Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.233982 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dh2bb" podUID="ae4804bb-2669-48fc-aa42-3e4f1c94323b" containerName="registry-server" containerID="cri-o://0fc1dd1e460505118499bbacfea4a6ed132b1cc3b385d8b14d06a0a087a4b3b6" gracePeriod=30 Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.253827 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8l8g4"] Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.254442 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8l8g4" podUID="ef64597f-59f1-47be-afc6-aa95fb3c355c" containerName="registry-server" containerID="cri-o://19b545bbd15e5da7e11ea2d5c1514c1735657be1aa7b1402052651578aff3cb8" gracePeriod=30 Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.257318 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-58rfh"] Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.257586 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" podUID="b36ff831-d91c-4350-a36b-bd0625ffb661" containerName="marketplace-operator" containerID="cri-o://192c1b63839f42897cd25ce2219a6d4505efeac1667b70d0d68169cb9800dce5" gracePeriod=30 Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.266665 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vbnc"] Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.266901 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4vbnc" 
podUID="6b2be658-340d-4dc2-89b8-ee1fbde43d23" containerName="registry-server" containerID="cri-o://b33a9584a38b9c1e3c9360213d802f11a4520acd114bca8b3f868ea70287a2dd" gracePeriod=30 Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.277681 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wrkz2"] Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.277985 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wrkz2" podUID="f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" containerName="registry-server" containerID="cri-o://4518f90a71984f29288b363af631a11d28801d3d91a5708e28b0e2653acef0e0" gracePeriod=30 Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.288627 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-76f84"] Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.289702 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-76f84" Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.300939 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-76f84"] Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.402035 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d608ef1-7f5b-45c5-80ce-f9be86cd93fe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-76f84\" (UID: \"7d608ef1-7f5b-45c5-80ce-f9be86cd93fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-76f84" Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.402091 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7d608ef1-7f5b-45c5-80ce-f9be86cd93fe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-76f84\" (UID: \"7d608ef1-7f5b-45c5-80ce-f9be86cd93fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-76f84" Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.402116 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhkhm\" (UniqueName: \"kubernetes.io/projected/7d608ef1-7f5b-45c5-80ce-f9be86cd93fe-kube-api-access-jhkhm\") pod \"marketplace-operator-79b997595-76f84\" (UID: \"7d608ef1-7f5b-45c5-80ce-f9be86cd93fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-76f84" Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.503932 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d608ef1-7f5b-45c5-80ce-f9be86cd93fe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-76f84\" (UID: 
\"7d608ef1-7f5b-45c5-80ce-f9be86cd93fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-76f84" Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.504003 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7d608ef1-7f5b-45c5-80ce-f9be86cd93fe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-76f84\" (UID: \"7d608ef1-7f5b-45c5-80ce-f9be86cd93fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-76f84" Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.504031 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhkhm\" (UniqueName: \"kubernetes.io/projected/7d608ef1-7f5b-45c5-80ce-f9be86cd93fe-kube-api-access-jhkhm\") pod \"marketplace-operator-79b997595-76f84\" (UID: \"7d608ef1-7f5b-45c5-80ce-f9be86cd93fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-76f84" Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.505698 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d608ef1-7f5b-45c5-80ce-f9be86cd93fe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-76f84\" (UID: \"7d608ef1-7f5b-45c5-80ce-f9be86cd93fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-76f84" Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.510013 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7d608ef1-7f5b-45c5-80ce-f9be86cd93fe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-76f84\" (UID: \"7d608ef1-7f5b-45c5-80ce-f9be86cd93fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-76f84" Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.523305 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jhkhm\" (UniqueName: \"kubernetes.io/projected/7d608ef1-7f5b-45c5-80ce-f9be86cd93fe-kube-api-access-jhkhm\") pod \"marketplace-operator-79b997595-76f84\" (UID: \"7d608ef1-7f5b-45c5-80ce-f9be86cd93fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-76f84" Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.616918 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-76f84" Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.786314 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dh2bb" Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.916013 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae4804bb-2669-48fc-aa42-3e4f1c94323b-catalog-content\") pod \"ae4804bb-2669-48fc-aa42-3e4f1c94323b\" (UID: \"ae4804bb-2669-48fc-aa42-3e4f1c94323b\") " Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.916154 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae4804bb-2669-48fc-aa42-3e4f1c94323b-utilities\") pod \"ae4804bb-2669-48fc-aa42-3e4f1c94323b\" (UID: \"ae4804bb-2669-48fc-aa42-3e4f1c94323b\") " Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.916187 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4dgn\" (UniqueName: \"kubernetes.io/projected/ae4804bb-2669-48fc-aa42-3e4f1c94323b-kube-api-access-v4dgn\") pod \"ae4804bb-2669-48fc-aa42-3e4f1c94323b\" (UID: \"ae4804bb-2669-48fc-aa42-3e4f1c94323b\") " Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.917429 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae4804bb-2669-48fc-aa42-3e4f1c94323b-utilities" (OuterVolumeSpecName: 
"utilities") pod "ae4804bb-2669-48fc-aa42-3e4f1c94323b" (UID: "ae4804bb-2669-48fc-aa42-3e4f1c94323b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.921284 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae4804bb-2669-48fc-aa42-3e4f1c94323b-kube-api-access-v4dgn" (OuterVolumeSpecName: "kube-api-access-v4dgn") pod "ae4804bb-2669-48fc-aa42-3e4f1c94323b" (UID: "ae4804bb-2669-48fc-aa42-3e4f1c94323b"). InnerVolumeSpecName "kube-api-access-v4dgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.976070 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8l8g4" Dec 16 15:01:26 crc kubenswrapper[4775]: I1216 15:01:26.986593 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.001249 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vbnc" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.005371 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wrkz2" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.010844 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae4804bb-2669-48fc-aa42-3e4f1c94323b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae4804bb-2669-48fc-aa42-3e4f1c94323b" (UID: "ae4804bb-2669-48fc-aa42-3e4f1c94323b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.017423 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae4804bb-2669-48fc-aa42-3e4f1c94323b-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.017451 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4dgn\" (UniqueName: \"kubernetes.io/projected/ae4804bb-2669-48fc-aa42-3e4f1c94323b-kube-api-access-v4dgn\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.017461 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae4804bb-2669-48fc-aa42-3e4f1c94323b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.118739 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2be658-340d-4dc2-89b8-ee1fbde43d23-catalog-content\") pod \"6b2be658-340d-4dc2-89b8-ee1fbde43d23\" (UID: \"6b2be658-340d-4dc2-89b8-ee1fbde43d23\") " Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.118783 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b36ff831-d91c-4350-a36b-bd0625ffb661-marketplace-operator-metrics\") pod \"b36ff831-d91c-4350-a36b-bd0625ffb661\" (UID: \"b36ff831-d91c-4350-a36b-bd0625ffb661\") " Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.118807 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-catalog-content\") pod \"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2\" (UID: \"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2\") " Dec 16 15:01:27 crc kubenswrapper[4775]: 
I1216 15:01:27.118825 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef64597f-59f1-47be-afc6-aa95fb3c355c-utilities\") pod \"ef64597f-59f1-47be-afc6-aa95fb3c355c\" (UID: \"ef64597f-59f1-47be-afc6-aa95fb3c355c\") " Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.118877 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmvtx\" (UniqueName: \"kubernetes.io/projected/ef64597f-59f1-47be-afc6-aa95fb3c355c-kube-api-access-xmvtx\") pod \"ef64597f-59f1-47be-afc6-aa95fb3c355c\" (UID: \"ef64597f-59f1-47be-afc6-aa95fb3c355c\") " Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.118924 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss5ht\" (UniqueName: \"kubernetes.io/projected/b36ff831-d91c-4350-a36b-bd0625ffb661-kube-api-access-ss5ht\") pod \"b36ff831-d91c-4350-a36b-bd0625ffb661\" (UID: \"b36ff831-d91c-4350-a36b-bd0625ffb661\") " Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.118948 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48r5p\" (UniqueName: \"kubernetes.io/projected/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-kube-api-access-48r5p\") pod \"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2\" (UID: \"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2\") " Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.118983 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef64597f-59f1-47be-afc6-aa95fb3c355c-catalog-content\") pod \"ef64597f-59f1-47be-afc6-aa95fb3c355c\" (UID: \"ef64597f-59f1-47be-afc6-aa95fb3c355c\") " Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.119017 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b36ff831-d91c-4350-a36b-bd0625ffb661-marketplace-trusted-ca\") pod \"b36ff831-d91c-4350-a36b-bd0625ffb661\" (UID: \"b36ff831-d91c-4350-a36b-bd0625ffb661\") " Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.119055 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2be658-340d-4dc2-89b8-ee1fbde43d23-utilities\") pod \"6b2be658-340d-4dc2-89b8-ee1fbde43d23\" (UID: \"6b2be658-340d-4dc2-89b8-ee1fbde43d23\") " Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.119073 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlqbr\" (UniqueName: \"kubernetes.io/projected/6b2be658-340d-4dc2-89b8-ee1fbde43d23-kube-api-access-rlqbr\") pod \"6b2be658-340d-4dc2-89b8-ee1fbde43d23\" (UID: \"6b2be658-340d-4dc2-89b8-ee1fbde43d23\") " Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.119108 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-utilities\") pod \"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2\" (UID: \"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2\") " Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.120807 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-utilities" (OuterVolumeSpecName: "utilities") pod "f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" (UID: "f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.121816 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef64597f-59f1-47be-afc6-aa95fb3c355c-utilities" (OuterVolumeSpecName: "utilities") pod "ef64597f-59f1-47be-afc6-aa95fb3c355c" (UID: "ef64597f-59f1-47be-afc6-aa95fb3c355c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.122181 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b36ff831-d91c-4350-a36b-bd0625ffb661-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b36ff831-d91c-4350-a36b-bd0625ffb661" (UID: "b36ff831-d91c-4350-a36b-bd0625ffb661"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.122622 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b2be658-340d-4dc2-89b8-ee1fbde43d23-utilities" (OuterVolumeSpecName: "utilities") pod "6b2be658-340d-4dc2-89b8-ee1fbde43d23" (UID: "6b2be658-340d-4dc2-89b8-ee1fbde43d23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.124113 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-kube-api-access-48r5p" (OuterVolumeSpecName: "kube-api-access-48r5p") pod "f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" (UID: "f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2"). InnerVolumeSpecName "kube-api-access-48r5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.124573 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b36ff831-d91c-4350-a36b-bd0625ffb661-kube-api-access-ss5ht" (OuterVolumeSpecName: "kube-api-access-ss5ht") pod "b36ff831-d91c-4350-a36b-bd0625ffb661" (UID: "b36ff831-d91c-4350-a36b-bd0625ffb661"). InnerVolumeSpecName "kube-api-access-ss5ht". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.124637 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef64597f-59f1-47be-afc6-aa95fb3c355c-kube-api-access-xmvtx" (OuterVolumeSpecName: "kube-api-access-xmvtx") pod "ef64597f-59f1-47be-afc6-aa95fb3c355c" (UID: "ef64597f-59f1-47be-afc6-aa95fb3c355c"). InnerVolumeSpecName "kube-api-access-xmvtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.124653 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b2be658-340d-4dc2-89b8-ee1fbde43d23-kube-api-access-rlqbr" (OuterVolumeSpecName: "kube-api-access-rlqbr") pod "6b2be658-340d-4dc2-89b8-ee1fbde43d23" (UID: "6b2be658-340d-4dc2-89b8-ee1fbde43d23"). InnerVolumeSpecName "kube-api-access-rlqbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.125615 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36ff831-d91c-4350-a36b-bd0625ffb661-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b36ff831-d91c-4350-a36b-bd0625ffb661" (UID: "b36ff831-d91c-4350-a36b-bd0625ffb661"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.142810 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b2be658-340d-4dc2-89b8-ee1fbde43d23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b2be658-340d-4dc2-89b8-ee1fbde43d23" (UID: "6b2be658-340d-4dc2-89b8-ee1fbde43d23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.178531 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef64597f-59f1-47be-afc6-aa95fb3c355c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef64597f-59f1-47be-afc6-aa95fb3c355c" (UID: "ef64597f-59f1-47be-afc6-aa95fb3c355c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.225420 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2be658-340d-4dc2-89b8-ee1fbde43d23-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.225489 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlqbr\" (UniqueName: \"kubernetes.io/projected/6b2be658-340d-4dc2-89b8-ee1fbde43d23-kube-api-access-rlqbr\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.225501 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.225512 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2be658-340d-4dc2-89b8-ee1fbde43d23-catalog-content\") on node \"crc\" 
DevicePath \"\"" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.225522 4775 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b36ff831-d91c-4350-a36b-bd0625ffb661-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.225535 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef64597f-59f1-47be-afc6-aa95fb3c355c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.225546 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmvtx\" (UniqueName: \"kubernetes.io/projected/ef64597f-59f1-47be-afc6-aa95fb3c355c-kube-api-access-xmvtx\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.225557 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss5ht\" (UniqueName: \"kubernetes.io/projected/b36ff831-d91c-4350-a36b-bd0625ffb661-kube-api-access-ss5ht\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.225567 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48r5p\" (UniqueName: \"kubernetes.io/projected/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-kube-api-access-48r5p\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.225579 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef64597f-59f1-47be-afc6-aa95fb3c355c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.225589 4775 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b36ff831-d91c-4350-a36b-bd0625ffb661-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:27 crc 
kubenswrapper[4775]: I1216 15:01:27.234197 4775 generic.go:334] "Generic (PLEG): container finished" podID="f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" containerID="4518f90a71984f29288b363af631a11d28801d3d91a5708e28b0e2653acef0e0" exitCode=0 Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.234412 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrkz2" event={"ID":"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2","Type":"ContainerDied","Data":"4518f90a71984f29288b363af631a11d28801d3d91a5708e28b0e2653acef0e0"} Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.234474 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wrkz2" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.234493 4775 scope.go:117] "RemoveContainer" containerID="4518f90a71984f29288b363af631a11d28801d3d91a5708e28b0e2653acef0e0" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.234479 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wrkz2" event={"ID":"f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2","Type":"ContainerDied","Data":"f0450d85832982a65eae76e07b9516ec9bbbd353f005d8ee8952cf05a5e90fe0"} Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.238018 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-76f84"] Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.240779 4775 generic.go:334] "Generic (PLEG): container finished" podID="ae4804bb-2669-48fc-aa42-3e4f1c94323b" containerID="0fc1dd1e460505118499bbacfea4a6ed132b1cc3b385d8b14d06a0a087a4b3b6" exitCode=0 Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.240884 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh2bb" 
event={"ID":"ae4804bb-2669-48fc-aa42-3e4f1c94323b","Type":"ContainerDied","Data":"0fc1dd1e460505118499bbacfea4a6ed132b1cc3b385d8b14d06a0a087a4b3b6"} Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.240935 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh2bb" event={"ID":"ae4804bb-2669-48fc-aa42-3e4f1c94323b","Type":"ContainerDied","Data":"8823a04aa97e5787bdc9218d608ccf5d912f95993f60aa6f811ee688743c00e7"} Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.241102 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dh2bb" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.242958 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" (UID: "f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.244154 4775 generic.go:334] "Generic (PLEG): container finished" podID="6b2be658-340d-4dc2-89b8-ee1fbde43d23" containerID="b33a9584a38b9c1e3c9360213d802f11a4520acd114bca8b3f868ea70287a2dd" exitCode=0 Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.244405 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vbnc" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.244519 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vbnc" event={"ID":"6b2be658-340d-4dc2-89b8-ee1fbde43d23","Type":"ContainerDied","Data":"b33a9584a38b9c1e3c9360213d802f11a4520acd114bca8b3f868ea70287a2dd"} Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.244550 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vbnc" event={"ID":"6b2be658-340d-4dc2-89b8-ee1fbde43d23","Type":"ContainerDied","Data":"55d930e33c096b55b10803525772dfdaa0588104d4719de2215ac7312da61919"} Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.247473 4775 generic.go:334] "Generic (PLEG): container finished" podID="b36ff831-d91c-4350-a36b-bd0625ffb661" containerID="192c1b63839f42897cd25ce2219a6d4505efeac1667b70d0d68169cb9800dce5" exitCode=0 Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.247556 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" event={"ID":"b36ff831-d91c-4350-a36b-bd0625ffb661","Type":"ContainerDied","Data":"192c1b63839f42897cd25ce2219a6d4505efeac1667b70d0d68169cb9800dce5"} Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.247585 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" event={"ID":"b36ff831-d91c-4350-a36b-bd0625ffb661","Type":"ContainerDied","Data":"2ee3f783093c346e7cbb789197f327f74cb8c31422af043f96c4391e13e77b8d"} Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.247648 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-58rfh" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.250039 4775 generic.go:334] "Generic (PLEG): container finished" podID="ef64597f-59f1-47be-afc6-aa95fb3c355c" containerID="19b545bbd15e5da7e11ea2d5c1514c1735657be1aa7b1402052651578aff3cb8" exitCode=0 Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.250104 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l8g4" event={"ID":"ef64597f-59f1-47be-afc6-aa95fb3c355c","Type":"ContainerDied","Data":"19b545bbd15e5da7e11ea2d5c1514c1735657be1aa7b1402052651578aff3cb8"} Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.250125 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l8g4" event={"ID":"ef64597f-59f1-47be-afc6-aa95fb3c355c","Type":"ContainerDied","Data":"10874e66f0848717503d87122e93dfcf157714cebc5e058a474a5e5a2540625f"} Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.250105 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8l8g4" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.255264 4775 scope.go:117] "RemoveContainer" containerID="e12314d0ee0b4d0c27ecaf8f3aae7dd048927c3d6027949a33f9d6627ca8deec" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.285489 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dh2bb"] Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.291999 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dh2bb"] Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.295283 4775 scope.go:117] "RemoveContainer" containerID="93821fab426e8805af45bae51a3f590c5dcbf9706873896e55731e596df31a3e" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.297412 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-58rfh"] Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.305664 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-58rfh"] Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.311473 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8l8g4"] Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.319400 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8l8g4"] Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.321928 4775 scope.go:117] "RemoveContainer" containerID="4518f90a71984f29288b363af631a11d28801d3d91a5708e28b0e2653acef0e0" Dec 16 15:01:27 crc kubenswrapper[4775]: E1216 15:01:27.322357 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4518f90a71984f29288b363af631a11d28801d3d91a5708e28b0e2653acef0e0\": container with ID starting with 
4518f90a71984f29288b363af631a11d28801d3d91a5708e28b0e2653acef0e0 not found: ID does not exist" containerID="4518f90a71984f29288b363af631a11d28801d3d91a5708e28b0e2653acef0e0" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.322402 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4518f90a71984f29288b363af631a11d28801d3d91a5708e28b0e2653acef0e0"} err="failed to get container status \"4518f90a71984f29288b363af631a11d28801d3d91a5708e28b0e2653acef0e0\": rpc error: code = NotFound desc = could not find container \"4518f90a71984f29288b363af631a11d28801d3d91a5708e28b0e2653acef0e0\": container with ID starting with 4518f90a71984f29288b363af631a11d28801d3d91a5708e28b0e2653acef0e0 not found: ID does not exist" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.322434 4775 scope.go:117] "RemoveContainer" containerID="e12314d0ee0b4d0c27ecaf8f3aae7dd048927c3d6027949a33f9d6627ca8deec" Dec 16 15:01:27 crc kubenswrapper[4775]: E1216 15:01:27.322808 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e12314d0ee0b4d0c27ecaf8f3aae7dd048927c3d6027949a33f9d6627ca8deec\": container with ID starting with e12314d0ee0b4d0c27ecaf8f3aae7dd048927c3d6027949a33f9d6627ca8deec not found: ID does not exist" containerID="e12314d0ee0b4d0c27ecaf8f3aae7dd048927c3d6027949a33f9d6627ca8deec" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.322837 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e12314d0ee0b4d0c27ecaf8f3aae7dd048927c3d6027949a33f9d6627ca8deec"} err="failed to get container status \"e12314d0ee0b4d0c27ecaf8f3aae7dd048927c3d6027949a33f9d6627ca8deec\": rpc error: code = NotFound desc = could not find container \"e12314d0ee0b4d0c27ecaf8f3aae7dd048927c3d6027949a33f9d6627ca8deec\": container with ID starting with e12314d0ee0b4d0c27ecaf8f3aae7dd048927c3d6027949a33f9d6627ca8deec not found: ID does not 
exist" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.322863 4775 scope.go:117] "RemoveContainer" containerID="93821fab426e8805af45bae51a3f590c5dcbf9706873896e55731e596df31a3e" Dec 16 15:01:27 crc kubenswrapper[4775]: E1216 15:01:27.323217 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93821fab426e8805af45bae51a3f590c5dcbf9706873896e55731e596df31a3e\": container with ID starting with 93821fab426e8805af45bae51a3f590c5dcbf9706873896e55731e596df31a3e not found: ID does not exist" containerID="93821fab426e8805af45bae51a3f590c5dcbf9706873896e55731e596df31a3e" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.323233 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93821fab426e8805af45bae51a3f590c5dcbf9706873896e55731e596df31a3e"} err="failed to get container status \"93821fab426e8805af45bae51a3f590c5dcbf9706873896e55731e596df31a3e\": rpc error: code = NotFound desc = could not find container \"93821fab426e8805af45bae51a3f590c5dcbf9706873896e55731e596df31a3e\": container with ID starting with 93821fab426e8805af45bae51a3f590c5dcbf9706873896e55731e596df31a3e not found: ID does not exist" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.323244 4775 scope.go:117] "RemoveContainer" containerID="0fc1dd1e460505118499bbacfea4a6ed132b1cc3b385d8b14d06a0a087a4b3b6" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.323298 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vbnc"] Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.325867 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vbnc"] Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.326223 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.344302 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b2be658-340d-4dc2-89b8-ee1fbde43d23" path="/var/lib/kubelet/pods/6b2be658-340d-4dc2-89b8-ee1fbde43d23/volumes" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.344989 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae4804bb-2669-48fc-aa42-3e4f1c94323b" path="/var/lib/kubelet/pods/ae4804bb-2669-48fc-aa42-3e4f1c94323b/volumes" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.345780 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b36ff831-d91c-4350-a36b-bd0625ffb661" path="/var/lib/kubelet/pods/b36ff831-d91c-4350-a36b-bd0625ffb661/volumes" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.346625 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef64597f-59f1-47be-afc6-aa95fb3c355c" path="/var/lib/kubelet/pods/ef64597f-59f1-47be-afc6-aa95fb3c355c/volumes" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.358089 4775 scope.go:117] "RemoveContainer" containerID="faa1a80b890f64b2a41dde8f97aabaf1e4552b03d3502e131a4c42cbe5f475f4" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.391151 4775 scope.go:117] "RemoveContainer" containerID="58dcd0089058b8564ce80e9ea21cfdd37b7245476ab12ddf99ca4d623da254d8" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.462150 4775 scope.go:117] "RemoveContainer" containerID="0fc1dd1e460505118499bbacfea4a6ed132b1cc3b385d8b14d06a0a087a4b3b6" Dec 16 15:01:27 crc kubenswrapper[4775]: E1216 15:01:27.462583 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fc1dd1e460505118499bbacfea4a6ed132b1cc3b385d8b14d06a0a087a4b3b6\": container with ID starting with 
0fc1dd1e460505118499bbacfea4a6ed132b1cc3b385d8b14d06a0a087a4b3b6 not found: ID does not exist" containerID="0fc1dd1e460505118499bbacfea4a6ed132b1cc3b385d8b14d06a0a087a4b3b6" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.462611 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fc1dd1e460505118499bbacfea4a6ed132b1cc3b385d8b14d06a0a087a4b3b6"} err="failed to get container status \"0fc1dd1e460505118499bbacfea4a6ed132b1cc3b385d8b14d06a0a087a4b3b6\": rpc error: code = NotFound desc = could not find container \"0fc1dd1e460505118499bbacfea4a6ed132b1cc3b385d8b14d06a0a087a4b3b6\": container with ID starting with 0fc1dd1e460505118499bbacfea4a6ed132b1cc3b385d8b14d06a0a087a4b3b6 not found: ID does not exist" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.462637 4775 scope.go:117] "RemoveContainer" containerID="faa1a80b890f64b2a41dde8f97aabaf1e4552b03d3502e131a4c42cbe5f475f4" Dec 16 15:01:27 crc kubenswrapper[4775]: E1216 15:01:27.463046 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faa1a80b890f64b2a41dde8f97aabaf1e4552b03d3502e131a4c42cbe5f475f4\": container with ID starting with faa1a80b890f64b2a41dde8f97aabaf1e4552b03d3502e131a4c42cbe5f475f4 not found: ID does not exist" containerID="faa1a80b890f64b2a41dde8f97aabaf1e4552b03d3502e131a4c42cbe5f475f4" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.463076 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faa1a80b890f64b2a41dde8f97aabaf1e4552b03d3502e131a4c42cbe5f475f4"} err="failed to get container status \"faa1a80b890f64b2a41dde8f97aabaf1e4552b03d3502e131a4c42cbe5f475f4\": rpc error: code = NotFound desc = could not find container \"faa1a80b890f64b2a41dde8f97aabaf1e4552b03d3502e131a4c42cbe5f475f4\": container with ID starting with faa1a80b890f64b2a41dde8f97aabaf1e4552b03d3502e131a4c42cbe5f475f4 not found: ID does not 
exist" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.463094 4775 scope.go:117] "RemoveContainer" containerID="58dcd0089058b8564ce80e9ea21cfdd37b7245476ab12ddf99ca4d623da254d8" Dec 16 15:01:27 crc kubenswrapper[4775]: E1216 15:01:27.463538 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58dcd0089058b8564ce80e9ea21cfdd37b7245476ab12ddf99ca4d623da254d8\": container with ID starting with 58dcd0089058b8564ce80e9ea21cfdd37b7245476ab12ddf99ca4d623da254d8 not found: ID does not exist" containerID="58dcd0089058b8564ce80e9ea21cfdd37b7245476ab12ddf99ca4d623da254d8" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.463587 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58dcd0089058b8564ce80e9ea21cfdd37b7245476ab12ddf99ca4d623da254d8"} err="failed to get container status \"58dcd0089058b8564ce80e9ea21cfdd37b7245476ab12ddf99ca4d623da254d8\": rpc error: code = NotFound desc = could not find container \"58dcd0089058b8564ce80e9ea21cfdd37b7245476ab12ddf99ca4d623da254d8\": container with ID starting with 58dcd0089058b8564ce80e9ea21cfdd37b7245476ab12ddf99ca4d623da254d8 not found: ID does not exist" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.463627 4775 scope.go:117] "RemoveContainer" containerID="b33a9584a38b9c1e3c9360213d802f11a4520acd114bca8b3f868ea70287a2dd" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.479457 4775 scope.go:117] "RemoveContainer" containerID="04426c3e77dc660a0a38af544b3e05b151570f49de7c9152482a6304bfdd8bd5" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.492840 4775 scope.go:117] "RemoveContainer" containerID="ea057c9d189586afd6c603cc5fa8b9aabf053b1f8e2c4a03fd2e693052c0a487" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.512046 4775 scope.go:117] "RemoveContainer" containerID="b33a9584a38b9c1e3c9360213d802f11a4520acd114bca8b3f868ea70287a2dd" Dec 16 15:01:27 crc 
kubenswrapper[4775]: E1216 15:01:27.512515 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b33a9584a38b9c1e3c9360213d802f11a4520acd114bca8b3f868ea70287a2dd\": container with ID starting with b33a9584a38b9c1e3c9360213d802f11a4520acd114bca8b3f868ea70287a2dd not found: ID does not exist" containerID="b33a9584a38b9c1e3c9360213d802f11a4520acd114bca8b3f868ea70287a2dd" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.512561 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b33a9584a38b9c1e3c9360213d802f11a4520acd114bca8b3f868ea70287a2dd"} err="failed to get container status \"b33a9584a38b9c1e3c9360213d802f11a4520acd114bca8b3f868ea70287a2dd\": rpc error: code = NotFound desc = could not find container \"b33a9584a38b9c1e3c9360213d802f11a4520acd114bca8b3f868ea70287a2dd\": container with ID starting with b33a9584a38b9c1e3c9360213d802f11a4520acd114bca8b3f868ea70287a2dd not found: ID does not exist" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.512594 4775 scope.go:117] "RemoveContainer" containerID="04426c3e77dc660a0a38af544b3e05b151570f49de7c9152482a6304bfdd8bd5" Dec 16 15:01:27 crc kubenswrapper[4775]: E1216 15:01:27.512850 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04426c3e77dc660a0a38af544b3e05b151570f49de7c9152482a6304bfdd8bd5\": container with ID starting with 04426c3e77dc660a0a38af544b3e05b151570f49de7c9152482a6304bfdd8bd5 not found: ID does not exist" containerID="04426c3e77dc660a0a38af544b3e05b151570f49de7c9152482a6304bfdd8bd5" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.512907 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04426c3e77dc660a0a38af544b3e05b151570f49de7c9152482a6304bfdd8bd5"} err="failed to get container status 
\"04426c3e77dc660a0a38af544b3e05b151570f49de7c9152482a6304bfdd8bd5\": rpc error: code = NotFound desc = could not find container \"04426c3e77dc660a0a38af544b3e05b151570f49de7c9152482a6304bfdd8bd5\": container with ID starting with 04426c3e77dc660a0a38af544b3e05b151570f49de7c9152482a6304bfdd8bd5 not found: ID does not exist" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.512924 4775 scope.go:117] "RemoveContainer" containerID="ea057c9d189586afd6c603cc5fa8b9aabf053b1f8e2c4a03fd2e693052c0a487" Dec 16 15:01:27 crc kubenswrapper[4775]: E1216 15:01:27.513247 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea057c9d189586afd6c603cc5fa8b9aabf053b1f8e2c4a03fd2e693052c0a487\": container with ID starting with ea057c9d189586afd6c603cc5fa8b9aabf053b1f8e2c4a03fd2e693052c0a487 not found: ID does not exist" containerID="ea057c9d189586afd6c603cc5fa8b9aabf053b1f8e2c4a03fd2e693052c0a487" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.513307 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea057c9d189586afd6c603cc5fa8b9aabf053b1f8e2c4a03fd2e693052c0a487"} err="failed to get container status \"ea057c9d189586afd6c603cc5fa8b9aabf053b1f8e2c4a03fd2e693052c0a487\": rpc error: code = NotFound desc = could not find container \"ea057c9d189586afd6c603cc5fa8b9aabf053b1f8e2c4a03fd2e693052c0a487\": container with ID starting with ea057c9d189586afd6c603cc5fa8b9aabf053b1f8e2c4a03fd2e693052c0a487 not found: ID does not exist" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.513347 4775 scope.go:117] "RemoveContainer" containerID="192c1b63839f42897cd25ce2219a6d4505efeac1667b70d0d68169cb9800dce5" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.533226 4775 scope.go:117] "RemoveContainer" containerID="192c1b63839f42897cd25ce2219a6d4505efeac1667b70d0d68169cb9800dce5" Dec 16 15:01:27 crc kubenswrapper[4775]: E1216 15:01:27.533737 4775 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"192c1b63839f42897cd25ce2219a6d4505efeac1667b70d0d68169cb9800dce5\": container with ID starting with 192c1b63839f42897cd25ce2219a6d4505efeac1667b70d0d68169cb9800dce5 not found: ID does not exist" containerID="192c1b63839f42897cd25ce2219a6d4505efeac1667b70d0d68169cb9800dce5" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.533779 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192c1b63839f42897cd25ce2219a6d4505efeac1667b70d0d68169cb9800dce5"} err="failed to get container status \"192c1b63839f42897cd25ce2219a6d4505efeac1667b70d0d68169cb9800dce5\": rpc error: code = NotFound desc = could not find container \"192c1b63839f42897cd25ce2219a6d4505efeac1667b70d0d68169cb9800dce5\": container with ID starting with 192c1b63839f42897cd25ce2219a6d4505efeac1667b70d0d68169cb9800dce5 not found: ID does not exist" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.533808 4775 scope.go:117] "RemoveContainer" containerID="19b545bbd15e5da7e11ea2d5c1514c1735657be1aa7b1402052651578aff3cb8" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.548718 4775 scope.go:117] "RemoveContainer" containerID="ce1cc3a5386474f059988a815e4b5b16ec62d935de75bda749ca6188a5f3aded" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.553108 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wrkz2"] Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.559150 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wrkz2"] Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.569784 4775 scope.go:117] "RemoveContainer" containerID="44aee6d9217e503300e08cebfa19294d0efbda3bfcfad3da2fa3d83defde84ea" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.584650 4775 scope.go:117] "RemoveContainer" 
containerID="19b545bbd15e5da7e11ea2d5c1514c1735657be1aa7b1402052651578aff3cb8" Dec 16 15:01:27 crc kubenswrapper[4775]: E1216 15:01:27.585063 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19b545bbd15e5da7e11ea2d5c1514c1735657be1aa7b1402052651578aff3cb8\": container with ID starting with 19b545bbd15e5da7e11ea2d5c1514c1735657be1aa7b1402052651578aff3cb8 not found: ID does not exist" containerID="19b545bbd15e5da7e11ea2d5c1514c1735657be1aa7b1402052651578aff3cb8" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.585097 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19b545bbd15e5da7e11ea2d5c1514c1735657be1aa7b1402052651578aff3cb8"} err="failed to get container status \"19b545bbd15e5da7e11ea2d5c1514c1735657be1aa7b1402052651578aff3cb8\": rpc error: code = NotFound desc = could not find container \"19b545bbd15e5da7e11ea2d5c1514c1735657be1aa7b1402052651578aff3cb8\": container with ID starting with 19b545bbd15e5da7e11ea2d5c1514c1735657be1aa7b1402052651578aff3cb8 not found: ID does not exist" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.585121 4775 scope.go:117] "RemoveContainer" containerID="ce1cc3a5386474f059988a815e4b5b16ec62d935de75bda749ca6188a5f3aded" Dec 16 15:01:27 crc kubenswrapper[4775]: E1216 15:01:27.585553 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce1cc3a5386474f059988a815e4b5b16ec62d935de75bda749ca6188a5f3aded\": container with ID starting with ce1cc3a5386474f059988a815e4b5b16ec62d935de75bda749ca6188a5f3aded not found: ID does not exist" containerID="ce1cc3a5386474f059988a815e4b5b16ec62d935de75bda749ca6188a5f3aded" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.585573 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ce1cc3a5386474f059988a815e4b5b16ec62d935de75bda749ca6188a5f3aded"} err="failed to get container status \"ce1cc3a5386474f059988a815e4b5b16ec62d935de75bda749ca6188a5f3aded\": rpc error: code = NotFound desc = could not find container \"ce1cc3a5386474f059988a815e4b5b16ec62d935de75bda749ca6188a5f3aded\": container with ID starting with ce1cc3a5386474f059988a815e4b5b16ec62d935de75bda749ca6188a5f3aded not found: ID does not exist" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.585586 4775 scope.go:117] "RemoveContainer" containerID="44aee6d9217e503300e08cebfa19294d0efbda3bfcfad3da2fa3d83defde84ea" Dec 16 15:01:27 crc kubenswrapper[4775]: E1216 15:01:27.585850 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44aee6d9217e503300e08cebfa19294d0efbda3bfcfad3da2fa3d83defde84ea\": container with ID starting with 44aee6d9217e503300e08cebfa19294d0efbda3bfcfad3da2fa3d83defde84ea not found: ID does not exist" containerID="44aee6d9217e503300e08cebfa19294d0efbda3bfcfad3da2fa3d83defde84ea" Dec 16 15:01:27 crc kubenswrapper[4775]: I1216 15:01:27.585868 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44aee6d9217e503300e08cebfa19294d0efbda3bfcfad3da2fa3d83defde84ea"} err="failed to get container status \"44aee6d9217e503300e08cebfa19294d0efbda3bfcfad3da2fa3d83defde84ea\": rpc error: code = NotFound desc = could not find container \"44aee6d9217e503300e08cebfa19294d0efbda3bfcfad3da2fa3d83defde84ea\": container with ID starting with 44aee6d9217e503300e08cebfa19294d0efbda3bfcfad3da2fa3d83defde84ea not found: ID does not exist" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.250401 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dzt2q"] Dec 16 15:01:28 crc kubenswrapper[4775]: E1216 15:01:28.250876 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ae4804bb-2669-48fc-aa42-3e4f1c94323b" containerName="registry-server" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.250902 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae4804bb-2669-48fc-aa42-3e4f1c94323b" containerName="registry-server" Dec 16 15:01:28 crc kubenswrapper[4775]: E1216 15:01:28.250912 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" containerName="extract-utilities" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.250920 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" containerName="extract-utilities" Dec 16 15:01:28 crc kubenswrapper[4775]: E1216 15:01:28.250929 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae4804bb-2669-48fc-aa42-3e4f1c94323b" containerName="extract-utilities" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.250937 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae4804bb-2669-48fc-aa42-3e4f1c94323b" containerName="extract-utilities" Dec 16 15:01:28 crc kubenswrapper[4775]: E1216 15:01:28.250951 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2be658-340d-4dc2-89b8-ee1fbde43d23" containerName="registry-server" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.250961 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2be658-340d-4dc2-89b8-ee1fbde43d23" containerName="registry-server" Dec 16 15:01:28 crc kubenswrapper[4775]: E1216 15:01:28.250976 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef64597f-59f1-47be-afc6-aa95fb3c355c" containerName="extract-content" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.250984 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef64597f-59f1-47be-afc6-aa95fb3c355c" containerName="extract-content" Dec 16 15:01:28 crc kubenswrapper[4775]: E1216 15:01:28.250997 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ef64597f-59f1-47be-afc6-aa95fb3c355c" containerName="extract-utilities" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.251005 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef64597f-59f1-47be-afc6-aa95fb3c355c" containerName="extract-utilities" Dec 16 15:01:28 crc kubenswrapper[4775]: E1216 15:01:28.251014 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36ff831-d91c-4350-a36b-bd0625ffb661" containerName="marketplace-operator" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.251021 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36ff831-d91c-4350-a36b-bd0625ffb661" containerName="marketplace-operator" Dec 16 15:01:28 crc kubenswrapper[4775]: E1216 15:01:28.251028 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae4804bb-2669-48fc-aa42-3e4f1c94323b" containerName="extract-content" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.251033 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae4804bb-2669-48fc-aa42-3e4f1c94323b" containerName="extract-content" Dec 16 15:01:28 crc kubenswrapper[4775]: E1216 15:01:28.251042 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" containerName="extract-content" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.251047 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" containerName="extract-content" Dec 16 15:01:28 crc kubenswrapper[4775]: E1216 15:01:28.251055 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" containerName="registry-server" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.251061 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" containerName="registry-server" Dec 16 15:01:28 crc kubenswrapper[4775]: E1216 15:01:28.251068 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ef64597f-59f1-47be-afc6-aa95fb3c355c" containerName="registry-server" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.251073 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef64597f-59f1-47be-afc6-aa95fb3c355c" containerName="registry-server" Dec 16 15:01:28 crc kubenswrapper[4775]: E1216 15:01:28.251082 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2be658-340d-4dc2-89b8-ee1fbde43d23" containerName="extract-content" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.251088 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2be658-340d-4dc2-89b8-ee1fbde43d23" containerName="extract-content" Dec 16 15:01:28 crc kubenswrapper[4775]: E1216 15:01:28.251097 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2be658-340d-4dc2-89b8-ee1fbde43d23" containerName="extract-utilities" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.251103 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2be658-340d-4dc2-89b8-ee1fbde43d23" containerName="extract-utilities" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.251207 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" containerName="registry-server" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.251218 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef64597f-59f1-47be-afc6-aa95fb3c355c" containerName="registry-server" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.251228 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b36ff831-d91c-4350-a36b-bd0625ffb661" containerName="marketplace-operator" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.251235 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae4804bb-2669-48fc-aa42-3e4f1c94323b" containerName="registry-server" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.251245 4775 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="6b2be658-340d-4dc2-89b8-ee1fbde43d23" containerName="registry-server" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.251997 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzt2q" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.254249 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.258966 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dzt2q"] Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.263731 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-76f84" event={"ID":"7d608ef1-7f5b-45c5-80ce-f9be86cd93fe","Type":"ContainerStarted","Data":"6b129eebb44c4319a352b9893942bf3472ab0be90b20ea2e27841474cf5ea8c5"} Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.263785 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-76f84" event={"ID":"7d608ef1-7f5b-45c5-80ce-f9be86cd93fe","Type":"ContainerStarted","Data":"455b9d300d50d9822d2755146a3452326527f9c14510cd216ad7021cd1fd17f6"} Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.263809 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-76f84" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.268003 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-76f84" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.291697 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-76f84" podStartSLOduration=2.291679124 podStartE2EDuration="2.291679124s" 
podCreationTimestamp="2025-12-16 15:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:01:28.291526979 +0000 UTC m=+413.242605902" watchObservedRunningTime="2025-12-16 15:01:28.291679124 +0000 UTC m=+413.242758047" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.345701 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2be58473-7d1b-4c58-a3a7-862cd4846f63-utilities\") pod \"certified-operators-dzt2q\" (UID: \"2be58473-7d1b-4c58-a3a7-862cd4846f63\") " pod="openshift-marketplace/certified-operators-dzt2q" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.345778 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9lzs\" (UniqueName: \"kubernetes.io/projected/2be58473-7d1b-4c58-a3a7-862cd4846f63-kube-api-access-q9lzs\") pod \"certified-operators-dzt2q\" (UID: \"2be58473-7d1b-4c58-a3a7-862cd4846f63\") " pod="openshift-marketplace/certified-operators-dzt2q" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.345882 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2be58473-7d1b-4c58-a3a7-862cd4846f63-catalog-content\") pod \"certified-operators-dzt2q\" (UID: \"2be58473-7d1b-4c58-a3a7-862cd4846f63\") " pod="openshift-marketplace/certified-operators-dzt2q" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.447405 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9lzs\" (UniqueName: \"kubernetes.io/projected/2be58473-7d1b-4c58-a3a7-862cd4846f63-kube-api-access-q9lzs\") pod \"certified-operators-dzt2q\" (UID: \"2be58473-7d1b-4c58-a3a7-862cd4846f63\") " pod="openshift-marketplace/certified-operators-dzt2q" Dec 16 15:01:28 crc 
kubenswrapper[4775]: I1216 15:01:28.447563 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2be58473-7d1b-4c58-a3a7-862cd4846f63-catalog-content\") pod \"certified-operators-dzt2q\" (UID: \"2be58473-7d1b-4c58-a3a7-862cd4846f63\") " pod="openshift-marketplace/certified-operators-dzt2q" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.447593 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2be58473-7d1b-4c58-a3a7-862cd4846f63-utilities\") pod \"certified-operators-dzt2q\" (UID: \"2be58473-7d1b-4c58-a3a7-862cd4846f63\") " pod="openshift-marketplace/certified-operators-dzt2q" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.448642 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2be58473-7d1b-4c58-a3a7-862cd4846f63-catalog-content\") pod \"certified-operators-dzt2q\" (UID: \"2be58473-7d1b-4c58-a3a7-862cd4846f63\") " pod="openshift-marketplace/certified-operators-dzt2q" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.448774 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2be58473-7d1b-4c58-a3a7-862cd4846f63-utilities\") pod \"certified-operators-dzt2q\" (UID: \"2be58473-7d1b-4c58-a3a7-862cd4846f63\") " pod="openshift-marketplace/certified-operators-dzt2q" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.466432 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9lzs\" (UniqueName: \"kubernetes.io/projected/2be58473-7d1b-4c58-a3a7-862cd4846f63-kube-api-access-q9lzs\") pod \"certified-operators-dzt2q\" (UID: \"2be58473-7d1b-4c58-a3a7-862cd4846f63\") " pod="openshift-marketplace/certified-operators-dzt2q" Dec 16 15:01:28 crc kubenswrapper[4775]: I1216 15:01:28.590032 4775 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzt2q" Dec 16 15:01:29 crc kubenswrapper[4775]: I1216 15:01:28.851351 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-67zrm"] Dec 16 15:01:29 crc kubenswrapper[4775]: I1216 15:01:28.852976 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67zrm" Dec 16 15:01:29 crc kubenswrapper[4775]: I1216 15:01:28.858418 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 16 15:01:29 crc kubenswrapper[4775]: I1216 15:01:28.863836 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67zrm"] Dec 16 15:01:29 crc kubenswrapper[4775]: I1216 15:01:28.954731 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59df28e1-27a5-451d-9784-a30eba2a3dc0-catalog-content\") pod \"redhat-marketplace-67zrm\" (UID: \"59df28e1-27a5-451d-9784-a30eba2a3dc0\") " pod="openshift-marketplace/redhat-marketplace-67zrm" Dec 16 15:01:29 crc kubenswrapper[4775]: I1216 15:01:28.954809 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm67b\" (UniqueName: \"kubernetes.io/projected/59df28e1-27a5-451d-9784-a30eba2a3dc0-kube-api-access-xm67b\") pod \"redhat-marketplace-67zrm\" (UID: \"59df28e1-27a5-451d-9784-a30eba2a3dc0\") " pod="openshift-marketplace/redhat-marketplace-67zrm" Dec 16 15:01:29 crc kubenswrapper[4775]: I1216 15:01:28.954873 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59df28e1-27a5-451d-9784-a30eba2a3dc0-utilities\") pod \"redhat-marketplace-67zrm\" (UID: 
\"59df28e1-27a5-451d-9784-a30eba2a3dc0\") " pod="openshift-marketplace/redhat-marketplace-67zrm" Dec 16 15:01:29 crc kubenswrapper[4775]: I1216 15:01:29.056473 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59df28e1-27a5-451d-9784-a30eba2a3dc0-utilities\") pod \"redhat-marketplace-67zrm\" (UID: \"59df28e1-27a5-451d-9784-a30eba2a3dc0\") " pod="openshift-marketplace/redhat-marketplace-67zrm" Dec 16 15:01:29 crc kubenswrapper[4775]: I1216 15:01:29.056588 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59df28e1-27a5-451d-9784-a30eba2a3dc0-catalog-content\") pod \"redhat-marketplace-67zrm\" (UID: \"59df28e1-27a5-451d-9784-a30eba2a3dc0\") " pod="openshift-marketplace/redhat-marketplace-67zrm" Dec 16 15:01:29 crc kubenswrapper[4775]: I1216 15:01:29.056636 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm67b\" (UniqueName: \"kubernetes.io/projected/59df28e1-27a5-451d-9784-a30eba2a3dc0-kube-api-access-xm67b\") pod \"redhat-marketplace-67zrm\" (UID: \"59df28e1-27a5-451d-9784-a30eba2a3dc0\") " pod="openshift-marketplace/redhat-marketplace-67zrm" Dec 16 15:01:29 crc kubenswrapper[4775]: I1216 15:01:29.057239 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59df28e1-27a5-451d-9784-a30eba2a3dc0-catalog-content\") pod \"redhat-marketplace-67zrm\" (UID: \"59df28e1-27a5-451d-9784-a30eba2a3dc0\") " pod="openshift-marketplace/redhat-marketplace-67zrm" Dec 16 15:01:29 crc kubenswrapper[4775]: I1216 15:01:29.057234 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59df28e1-27a5-451d-9784-a30eba2a3dc0-utilities\") pod \"redhat-marketplace-67zrm\" (UID: \"59df28e1-27a5-451d-9784-a30eba2a3dc0\") " 
pod="openshift-marketplace/redhat-marketplace-67zrm" Dec 16 15:01:29 crc kubenswrapper[4775]: I1216 15:01:29.075720 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm67b\" (UniqueName: \"kubernetes.io/projected/59df28e1-27a5-451d-9784-a30eba2a3dc0-kube-api-access-xm67b\") pod \"redhat-marketplace-67zrm\" (UID: \"59df28e1-27a5-451d-9784-a30eba2a3dc0\") " pod="openshift-marketplace/redhat-marketplace-67zrm" Dec 16 15:01:29 crc kubenswrapper[4775]: I1216 15:01:29.169245 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67zrm" Dec 16 15:01:29 crc kubenswrapper[4775]: I1216 15:01:29.344525 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2" path="/var/lib/kubelet/pods/f2edf9e1-eb45-47d4-a91d-ab4e4b0f90e2/volumes" Dec 16 15:01:30 crc kubenswrapper[4775]: I1216 15:01:30.017468 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dzt2q"] Dec 16 15:01:30 crc kubenswrapper[4775]: I1216 15:01:30.024068 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67zrm"] Dec 16 15:01:30 crc kubenswrapper[4775]: I1216 15:01:30.294339 4775 generic.go:334] "Generic (PLEG): container finished" podID="2be58473-7d1b-4c58-a3a7-862cd4846f63" containerID="b92763adec0fcd3e8c491aea68f0b51226fa3b03c52458d52081832282ca5d3e" exitCode=0 Dec 16 15:01:30 crc kubenswrapper[4775]: I1216 15:01:30.294405 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzt2q" event={"ID":"2be58473-7d1b-4c58-a3a7-862cd4846f63","Type":"ContainerDied","Data":"b92763adec0fcd3e8c491aea68f0b51226fa3b03c52458d52081832282ca5d3e"} Dec 16 15:01:30 crc kubenswrapper[4775]: I1216 15:01:30.294695 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzt2q" 
event={"ID":"2be58473-7d1b-4c58-a3a7-862cd4846f63","Type":"ContainerStarted","Data":"08784e40323c4dd5d7099fafe2e9813c1c907379d83841b889768ef487b9deaf"} Dec 16 15:01:30 crc kubenswrapper[4775]: I1216 15:01:30.297383 4775 generic.go:334] "Generic (PLEG): container finished" podID="59df28e1-27a5-451d-9784-a30eba2a3dc0" containerID="c5d77eed6d9e98928fc4f5247832106d4301d90a1832d24add4277677a07a486" exitCode=0 Dec 16 15:01:30 crc kubenswrapper[4775]: I1216 15:01:30.297816 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67zrm" event={"ID":"59df28e1-27a5-451d-9784-a30eba2a3dc0","Type":"ContainerDied","Data":"c5d77eed6d9e98928fc4f5247832106d4301d90a1832d24add4277677a07a486"} Dec 16 15:01:30 crc kubenswrapper[4775]: I1216 15:01:30.297837 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67zrm" event={"ID":"59df28e1-27a5-451d-9784-a30eba2a3dc0","Type":"ContainerStarted","Data":"82dacf2bfd70ac4110311e8ddd0ad13f15ced3650bee256444f99a4f592f8a86"} Dec 16 15:01:30 crc kubenswrapper[4775]: I1216 15:01:30.846805 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vj7n7"] Dec 16 15:01:30 crc kubenswrapper[4775]: I1216 15:01:30.848135 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vj7n7" Dec 16 15:01:30 crc kubenswrapper[4775]: I1216 15:01:30.850030 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 16 15:01:30 crc kubenswrapper[4775]: I1216 15:01:30.856346 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vj7n7"] Dec 16 15:01:30 crc kubenswrapper[4775]: I1216 15:01:30.988645 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e02474-416e-4434-b482-2df56ae4c6a7-catalog-content\") pod \"redhat-operators-vj7n7\" (UID: \"f0e02474-416e-4434-b482-2df56ae4c6a7\") " pod="openshift-marketplace/redhat-operators-vj7n7" Dec 16 15:01:30 crc kubenswrapper[4775]: I1216 15:01:30.988737 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fckz\" (UniqueName: \"kubernetes.io/projected/f0e02474-416e-4434-b482-2df56ae4c6a7-kube-api-access-5fckz\") pod \"redhat-operators-vj7n7\" (UID: \"f0e02474-416e-4434-b482-2df56ae4c6a7\") " pod="openshift-marketplace/redhat-operators-vj7n7" Dec 16 15:01:30 crc kubenswrapper[4775]: I1216 15:01:30.988869 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e02474-416e-4434-b482-2df56ae4c6a7-utilities\") pod \"redhat-operators-vj7n7\" (UID: \"f0e02474-416e-4434-b482-2df56ae4c6a7\") " pod="openshift-marketplace/redhat-operators-vj7n7" Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.090144 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e02474-416e-4434-b482-2df56ae4c6a7-catalog-content\") pod \"redhat-operators-vj7n7\" (UID: 
\"f0e02474-416e-4434-b482-2df56ae4c6a7\") " pod="openshift-marketplace/redhat-operators-vj7n7" Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.090812 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e02474-416e-4434-b482-2df56ae4c6a7-catalog-content\") pod \"redhat-operators-vj7n7\" (UID: \"f0e02474-416e-4434-b482-2df56ae4c6a7\") " pod="openshift-marketplace/redhat-operators-vj7n7" Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.090927 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fckz\" (UniqueName: \"kubernetes.io/projected/f0e02474-416e-4434-b482-2df56ae4c6a7-kube-api-access-5fckz\") pod \"redhat-operators-vj7n7\" (UID: \"f0e02474-416e-4434-b482-2df56ae4c6a7\") " pod="openshift-marketplace/redhat-operators-vj7n7" Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.090961 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e02474-416e-4434-b482-2df56ae4c6a7-utilities\") pod \"redhat-operators-vj7n7\" (UID: \"f0e02474-416e-4434-b482-2df56ae4c6a7\") " pod="openshift-marketplace/redhat-operators-vj7n7" Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.091281 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e02474-416e-4434-b482-2df56ae4c6a7-utilities\") pod \"redhat-operators-vj7n7\" (UID: \"f0e02474-416e-4434-b482-2df56ae4c6a7\") " pod="openshift-marketplace/redhat-operators-vj7n7" Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.114972 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fckz\" (UniqueName: \"kubernetes.io/projected/f0e02474-416e-4434-b482-2df56ae4c6a7-kube-api-access-5fckz\") pod \"redhat-operators-vj7n7\" (UID: \"f0e02474-416e-4434-b482-2df56ae4c6a7\") " 
pod="openshift-marketplace/redhat-operators-vj7n7" Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.166848 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vj7n7" Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.326414 4775 generic.go:334] "Generic (PLEG): container finished" podID="59df28e1-27a5-451d-9784-a30eba2a3dc0" containerID="23bbf2b0e523ad3de35a0aae69b18588aa2bc6cfafa082648caa0a8215b3b564" exitCode=0 Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.326505 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67zrm" event={"ID":"59df28e1-27a5-451d-9784-a30eba2a3dc0","Type":"ContainerDied","Data":"23bbf2b0e523ad3de35a0aae69b18588aa2bc6cfafa082648caa0a8215b3b564"} Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.448222 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fwsw2"] Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.449467 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fwsw2" Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.453845 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.459131 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fwsw2"] Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.603630 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vj7n7"] Dec 16 15:01:31 crc kubenswrapper[4775]: W1216 15:01:31.611021 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0e02474_416e_4434_b482_2df56ae4c6a7.slice/crio-6c3da41871fb31447465411bc5aaa8f65ded6ff9ab5309b9ef5f56a1ffad3cfe WatchSource:0}: Error finding container 6c3da41871fb31447465411bc5aaa8f65ded6ff9ab5309b9ef5f56a1ffad3cfe: Status 404 returned error can't find the container with id 6c3da41871fb31447465411bc5aaa8f65ded6ff9ab5309b9ef5f56a1ffad3cfe Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.622793 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhgvg\" (UniqueName: \"kubernetes.io/projected/37c67918-469b-4d46-aabb-63b96e941479-kube-api-access-qhgvg\") pod \"community-operators-fwsw2\" (UID: \"37c67918-469b-4d46-aabb-63b96e941479\") " pod="openshift-marketplace/community-operators-fwsw2" Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.622905 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c67918-469b-4d46-aabb-63b96e941479-catalog-content\") pod \"community-operators-fwsw2\" (UID: \"37c67918-469b-4d46-aabb-63b96e941479\") " pod="openshift-marketplace/community-operators-fwsw2" 
Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.622953 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37c67918-469b-4d46-aabb-63b96e941479-utilities\") pod \"community-operators-fwsw2\" (UID: \"37c67918-469b-4d46-aabb-63b96e941479\") " pod="openshift-marketplace/community-operators-fwsw2" Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.723929 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37c67918-469b-4d46-aabb-63b96e941479-utilities\") pod \"community-operators-fwsw2\" (UID: \"37c67918-469b-4d46-aabb-63b96e941479\") " pod="openshift-marketplace/community-operators-fwsw2" Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.723994 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhgvg\" (UniqueName: \"kubernetes.io/projected/37c67918-469b-4d46-aabb-63b96e941479-kube-api-access-qhgvg\") pod \"community-operators-fwsw2\" (UID: \"37c67918-469b-4d46-aabb-63b96e941479\") " pod="openshift-marketplace/community-operators-fwsw2" Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.724053 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c67918-469b-4d46-aabb-63b96e941479-catalog-content\") pod \"community-operators-fwsw2\" (UID: \"37c67918-469b-4d46-aabb-63b96e941479\") " pod="openshift-marketplace/community-operators-fwsw2" Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.724502 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c67918-469b-4d46-aabb-63b96e941479-catalog-content\") pod \"community-operators-fwsw2\" (UID: \"37c67918-469b-4d46-aabb-63b96e941479\") " pod="openshift-marketplace/community-operators-fwsw2" Dec 16 15:01:31 
crc kubenswrapper[4775]: I1216 15:01:31.724602 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37c67918-469b-4d46-aabb-63b96e941479-utilities\") pod \"community-operators-fwsw2\" (UID: \"37c67918-469b-4d46-aabb-63b96e941479\") " pod="openshift-marketplace/community-operators-fwsw2" Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.743543 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhgvg\" (UniqueName: \"kubernetes.io/projected/37c67918-469b-4d46-aabb-63b96e941479-kube-api-access-qhgvg\") pod \"community-operators-fwsw2\" (UID: \"37c67918-469b-4d46-aabb-63b96e941479\") " pod="openshift-marketplace/community-operators-fwsw2" Dec 16 15:01:31 crc kubenswrapper[4775]: I1216 15:01:31.766446 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fwsw2" Dec 16 15:01:32 crc kubenswrapper[4775]: I1216 15:01:32.168004 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fwsw2"] Dec 16 15:01:32 crc kubenswrapper[4775]: I1216 15:01:32.334107 4775 generic.go:334] "Generic (PLEG): container finished" podID="f0e02474-416e-4434-b482-2df56ae4c6a7" containerID="0bf3341889a29cfdc13f39ef3ffe9dc41cd1758174c2c5ef23f67e256b4e7350" exitCode=0 Dec 16 15:01:32 crc kubenswrapper[4775]: I1216 15:01:32.334170 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj7n7" event={"ID":"f0e02474-416e-4434-b482-2df56ae4c6a7","Type":"ContainerDied","Data":"0bf3341889a29cfdc13f39ef3ffe9dc41cd1758174c2c5ef23f67e256b4e7350"} Dec 16 15:01:32 crc kubenswrapper[4775]: I1216 15:01:32.334200 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj7n7" 
event={"ID":"f0e02474-416e-4434-b482-2df56ae4c6a7","Type":"ContainerStarted","Data":"6c3da41871fb31447465411bc5aaa8f65ded6ff9ab5309b9ef5f56a1ffad3cfe"} Dec 16 15:01:32 crc kubenswrapper[4775]: I1216 15:01:32.339240 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzt2q" event={"ID":"2be58473-7d1b-4c58-a3a7-862cd4846f63","Type":"ContainerStarted","Data":"2ebafe6b84e9567482ce792aaeb5efe10452c088ad12d404a028fc83576fee0c"} Dec 16 15:01:32 crc kubenswrapper[4775]: I1216 15:01:32.341502 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwsw2" event={"ID":"37c67918-469b-4d46-aabb-63b96e941479","Type":"ContainerStarted","Data":"649cb7a98dae3aef919065063742c6377c5c41c21370fe1bba0bdece7563c82c"} Dec 16 15:01:32 crc kubenswrapper[4775]: I1216 15:01:32.868953 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:01:32 crc kubenswrapper[4775]: I1216 15:01:32.869239 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:01:32 crc kubenswrapper[4775]: I1216 15:01:32.869296 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 15:01:32 crc kubenswrapper[4775]: I1216 15:01:32.869986 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"9e3465738032f654be4ca1d279ee3d3d38d1b27f93903e514f0b4016991fc071"} pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:01:32 crc kubenswrapper[4775]: I1216 15:01:32.870042 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" containerID="cri-o://9e3465738032f654be4ca1d279ee3d3d38d1b27f93903e514f0b4016991fc071" gracePeriod=600 Dec 16 15:01:33 crc kubenswrapper[4775]: I1216 15:01:33.347708 4775 generic.go:334] "Generic (PLEG): container finished" podID="2be58473-7d1b-4c58-a3a7-862cd4846f63" containerID="2ebafe6b84e9567482ce792aaeb5efe10452c088ad12d404a028fc83576fee0c" exitCode=0 Dec 16 15:01:33 crc kubenswrapper[4775]: I1216 15:01:33.347837 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzt2q" event={"ID":"2be58473-7d1b-4c58-a3a7-862cd4846f63","Type":"ContainerDied","Data":"2ebafe6b84e9567482ce792aaeb5efe10452c088ad12d404a028fc83576fee0c"} Dec 16 15:01:33 crc kubenswrapper[4775]: I1216 15:01:33.350271 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67zrm" event={"ID":"59df28e1-27a5-451d-9784-a30eba2a3dc0","Type":"ContainerStarted","Data":"59c849027a26ccffd6b29e14bea32357831e0c26fd74e09e41e1223789b6e9f7"} Dec 16 15:01:33 crc kubenswrapper[4775]: I1216 15:01:33.353250 4775 generic.go:334] "Generic (PLEG): container finished" podID="37c67918-469b-4d46-aabb-63b96e941479" containerID="83ed83200e3d2137e0acd5b693fd4514f79333d0dc2aba3c1e923296d173a190" exitCode=0 Dec 16 15:01:33 crc kubenswrapper[4775]: I1216 15:01:33.353320 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwsw2" 
event={"ID":"37c67918-469b-4d46-aabb-63b96e941479","Type":"ContainerDied","Data":"83ed83200e3d2137e0acd5b693fd4514f79333d0dc2aba3c1e923296d173a190"} Dec 16 15:01:33 crc kubenswrapper[4775]: I1216 15:01:33.359298 4775 generic.go:334] "Generic (PLEG): container finished" podID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerID="9e3465738032f654be4ca1d279ee3d3d38d1b27f93903e514f0b4016991fc071" exitCode=0 Dec 16 15:01:33 crc kubenswrapper[4775]: I1216 15:01:33.359368 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerDied","Data":"9e3465738032f654be4ca1d279ee3d3d38d1b27f93903e514f0b4016991fc071"} Dec 16 15:01:33 crc kubenswrapper[4775]: I1216 15:01:33.359412 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"3726a17a41d21de0c1144f1afb1120defbbe2b018d4ec48bc1ed4d607865dfc9"} Dec 16 15:01:33 crc kubenswrapper[4775]: I1216 15:01:33.359453 4775 scope.go:117] "RemoveContainer" containerID="e64266347ec3070cf81c73ff16e200ac01bcf0f83a3f98512304f9fdf4ea1d67" Dec 16 15:01:33 crc kubenswrapper[4775]: I1216 15:01:33.403702 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-67zrm" podStartSLOduration=3.292240756 podStartE2EDuration="5.40368661s" podCreationTimestamp="2025-12-16 15:01:28 +0000 UTC" firstStartedPulling="2025-12-16 15:01:30.300269973 +0000 UTC m=+415.251348896" lastFinishedPulling="2025-12-16 15:01:32.411715827 +0000 UTC m=+417.362794750" observedRunningTime="2025-12-16 15:01:33.401997716 +0000 UTC m=+418.353076659" watchObservedRunningTime="2025-12-16 15:01:33.40368661 +0000 UTC m=+418.354765533" Dec 16 15:01:34 crc kubenswrapper[4775]: I1216 15:01:34.366228 4775 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-dzt2q" event={"ID":"2be58473-7d1b-4c58-a3a7-862cd4846f63","Type":"ContainerStarted","Data":"b59fbccccf16cf9ac1c30617552986c59d9ab3fd46ed3d6805f8a04393ac8ff5"} Dec 16 15:01:34 crc kubenswrapper[4775]: I1216 15:01:34.372539 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj7n7" event={"ID":"f0e02474-416e-4434-b482-2df56ae4c6a7","Type":"ContainerStarted","Data":"58ce6bf10e5159cdb3c6497881ffecda7670412018bbc5f1082315bb5640868a"} Dec 16 15:01:34 crc kubenswrapper[4775]: I1216 15:01:34.385881 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dzt2q" podStartSLOduration=2.7317785690000003 podStartE2EDuration="6.385862874s" podCreationTimestamp="2025-12-16 15:01:28 +0000 UTC" firstStartedPulling="2025-12-16 15:01:30.296125754 +0000 UTC m=+415.247204677" lastFinishedPulling="2025-12-16 15:01:33.950210039 +0000 UTC m=+418.901288982" observedRunningTime="2025-12-16 15:01:34.384313515 +0000 UTC m=+419.335392458" watchObservedRunningTime="2025-12-16 15:01:34.385862874 +0000 UTC m=+419.336941797" Dec 16 15:01:35 crc kubenswrapper[4775]: I1216 15:01:35.381099 4775 generic.go:334] "Generic (PLEG): container finished" podID="f0e02474-416e-4434-b482-2df56ae4c6a7" containerID="58ce6bf10e5159cdb3c6497881ffecda7670412018bbc5f1082315bb5640868a" exitCode=0 Dec 16 15:01:35 crc kubenswrapper[4775]: I1216 15:01:35.381543 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj7n7" event={"ID":"f0e02474-416e-4434-b482-2df56ae4c6a7","Type":"ContainerDied","Data":"58ce6bf10e5159cdb3c6497881ffecda7670412018bbc5f1082315bb5640868a"} Dec 16 15:01:35 crc kubenswrapper[4775]: I1216 15:01:35.397159 4775 generic.go:334] "Generic (PLEG): container finished" podID="37c67918-469b-4d46-aabb-63b96e941479" containerID="2e746a3b970f70f853b92eeff4627f706ddeba4e0e9a14aaa4500c01746ce59a" exitCode=0 
Dec 16 15:01:35 crc kubenswrapper[4775]: I1216 15:01:35.398973 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwsw2" event={"ID":"37c67918-469b-4d46-aabb-63b96e941479","Type":"ContainerDied","Data":"2e746a3b970f70f853b92eeff4627f706ddeba4e0e9a14aaa4500c01746ce59a"} Dec 16 15:01:37 crc kubenswrapper[4775]: I1216 15:01:37.407713 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj7n7" event={"ID":"f0e02474-416e-4434-b482-2df56ae4c6a7","Type":"ContainerStarted","Data":"feb35b471f5dc44215765b93c63ac6cea02aa217582fc92aee78cc5074d69757"} Dec 16 15:01:37 crc kubenswrapper[4775]: I1216 15:01:37.410272 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwsw2" event={"ID":"37c67918-469b-4d46-aabb-63b96e941479","Type":"ContainerStarted","Data":"c30fd08a6d5641ce7f73dfe525b312093e2a6fd760d0dd2ce9f877c2b287f343"} Dec 16 15:01:37 crc kubenswrapper[4775]: I1216 15:01:37.444544 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vj7n7" podStartSLOduration=3.735665665 podStartE2EDuration="7.444522163s" podCreationTimestamp="2025-12-16 15:01:30 +0000 UTC" firstStartedPulling="2025-12-16 15:01:32.33611007 +0000 UTC m=+417.287188993" lastFinishedPulling="2025-12-16 15:01:36.044966568 +0000 UTC m=+420.996045491" observedRunningTime="2025-12-16 15:01:37.425185515 +0000 UTC m=+422.376264448" watchObservedRunningTime="2025-12-16 15:01:37.444522163 +0000 UTC m=+422.395601086" Dec 16 15:01:37 crc kubenswrapper[4775]: I1216 15:01:37.444815 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fwsw2" podStartSLOduration=3.938856258 podStartE2EDuration="6.444811212s" podCreationTimestamp="2025-12-16 15:01:31 +0000 UTC" firstStartedPulling="2025-12-16 15:01:33.355267588 +0000 UTC m=+418.306346511" 
lastFinishedPulling="2025-12-16 15:01:35.861222542 +0000 UTC m=+420.812301465" observedRunningTime="2025-12-16 15:01:37.441507288 +0000 UTC m=+422.392586221" watchObservedRunningTime="2025-12-16 15:01:37.444811212 +0000 UTC m=+422.395890135" Dec 16 15:01:38 crc kubenswrapper[4775]: I1216 15:01:38.590713 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dzt2q" Dec 16 15:01:38 crc kubenswrapper[4775]: I1216 15:01:38.591993 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dzt2q" Dec 16 15:01:38 crc kubenswrapper[4775]: I1216 15:01:38.664841 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dzt2q" Dec 16 15:01:39 crc kubenswrapper[4775]: I1216 15:01:39.169560 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-67zrm" Dec 16 15:01:39 crc kubenswrapper[4775]: I1216 15:01:39.169651 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-67zrm" Dec 16 15:01:39 crc kubenswrapper[4775]: I1216 15:01:39.187218 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7r67s" Dec 16 15:01:39 crc kubenswrapper[4775]: I1216 15:01:39.232739 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-67zrm" Dec 16 15:01:39 crc kubenswrapper[4775]: I1216 15:01:39.245641 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sxl49"] Dec 16 15:01:39 crc kubenswrapper[4775]: I1216 15:01:39.477756 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-67zrm" Dec 16 15:01:39 crc kubenswrapper[4775]: I1216 
15:01:39.480115 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dzt2q" Dec 16 15:01:41 crc kubenswrapper[4775]: I1216 15:01:41.168085 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vj7n7" Dec 16 15:01:41 crc kubenswrapper[4775]: I1216 15:01:41.168492 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vj7n7" Dec 16 15:01:41 crc kubenswrapper[4775]: I1216 15:01:41.766880 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fwsw2" Dec 16 15:01:41 crc kubenswrapper[4775]: I1216 15:01:41.767250 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fwsw2" Dec 16 15:01:41 crc kubenswrapper[4775]: I1216 15:01:41.812939 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fwsw2" Dec 16 15:01:42 crc kubenswrapper[4775]: I1216 15:01:42.202943 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vj7n7" podUID="f0e02474-416e-4434-b482-2df56ae4c6a7" containerName="registry-server" probeResult="failure" output=< Dec 16 15:01:42 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Dec 16 15:01:42 crc kubenswrapper[4775]: > Dec 16 15:01:42 crc kubenswrapper[4775]: I1216 15:01:42.482819 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fwsw2" Dec 16 15:01:51 crc kubenswrapper[4775]: I1216 15:01:51.209279 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vj7n7" Dec 16 15:01:51 crc kubenswrapper[4775]: I1216 15:01:51.253178 4775 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vj7n7" Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.283921 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" podUID="55dc0f62-62c4-48e2-9eb9-4998ad616e7f" containerName="registry" containerID="cri-o://c9832c81b687b5b6bba176a28f175b833d7aed51a628d7d2594e3aa9550a03e0" gracePeriod=30 Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.566151 4775 generic.go:334] "Generic (PLEG): container finished" podID="55dc0f62-62c4-48e2-9eb9-4998ad616e7f" containerID="c9832c81b687b5b6bba176a28f175b833d7aed51a628d7d2594e3aa9550a03e0" exitCode=0 Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.566199 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" event={"ID":"55dc0f62-62c4-48e2-9eb9-4998ad616e7f","Type":"ContainerDied","Data":"c9832c81b687b5b6bba176a28f175b833d7aed51a628d7d2594e3aa9550a03e0"} Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.704388 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.741496 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-ca-trust-extracted\") pod \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.741776 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.741858 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-installation-pull-secrets\") pod \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.742959 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-registry-certificates\") pod \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.743215 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-registry-tls\") pod \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.743265 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j94w\" (UniqueName: \"kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-kube-api-access-5j94w\") pod \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.743373 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-trusted-ca\") pod \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.743412 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-bound-sa-token\") pod \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\" (UID: \"55dc0f62-62c4-48e2-9eb9-4998ad616e7f\") " Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.744299 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "55dc0f62-62c4-48e2-9eb9-4998ad616e7f" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.744537 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "55dc0f62-62c4-48e2-9eb9-4998ad616e7f" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.750933 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "55dc0f62-62c4-48e2-9eb9-4998ad616e7f" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.754033 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-kube-api-access-5j94w" (OuterVolumeSpecName: "kube-api-access-5j94w") pod "55dc0f62-62c4-48e2-9eb9-4998ad616e7f" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f"). InnerVolumeSpecName "kube-api-access-5j94w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.755024 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "55dc0f62-62c4-48e2-9eb9-4998ad616e7f" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.755916 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "55dc0f62-62c4-48e2-9eb9-4998ad616e7f" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.757321 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "55dc0f62-62c4-48e2-9eb9-4998ad616e7f" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.775448 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "55dc0f62-62c4-48e2-9eb9-4998ad616e7f" (UID: "55dc0f62-62c4-48e2-9eb9-4998ad616e7f"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.845008 4775 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.845042 4775 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.845052 4775 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.845062 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j94w\" (UniqueName: 
\"kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-kube-api-access-5j94w\") on node \"crc\" DevicePath \"\"" Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.845071 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.845081 4775 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 15:02:04 crc kubenswrapper[4775]: I1216 15:02:04.845089 4775 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/55dc0f62-62c4-48e2-9eb9-4998ad616e7f-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 16 15:02:05 crc kubenswrapper[4775]: I1216 15:02:05.572232 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" event={"ID":"55dc0f62-62c4-48e2-9eb9-4998ad616e7f","Type":"ContainerDied","Data":"2ea92d2458be6f4dbd64fd9c099f685de6ce1472f38b3b3f40bc8663695432d3"} Dec 16 15:02:05 crc kubenswrapper[4775]: I1216 15:02:05.572288 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sxl49" Dec 16 15:02:05 crc kubenswrapper[4775]: I1216 15:02:05.572300 4775 scope.go:117] "RemoveContainer" containerID="c9832c81b687b5b6bba176a28f175b833d7aed51a628d7d2594e3aa9550a03e0" Dec 16 15:02:05 crc kubenswrapper[4775]: I1216 15:02:05.595660 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sxl49"] Dec 16 15:02:05 crc kubenswrapper[4775]: I1216 15:02:05.602002 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sxl49"] Dec 16 15:02:07 crc kubenswrapper[4775]: I1216 15:02:07.348557 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55dc0f62-62c4-48e2-9eb9-4998ad616e7f" path="/var/lib/kubelet/pods/55dc0f62-62c4-48e2-9eb9-4998ad616e7f/volumes" Dec 16 15:04:02 crc kubenswrapper[4775]: I1216 15:04:02.868992 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:04:02 crc kubenswrapper[4775]: I1216 15:04:02.869630 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:04:32 crc kubenswrapper[4775]: I1216 15:04:32.869683 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:04:32 
crc kubenswrapper[4775]: I1216 15:04:32.870346 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:05:02 crc kubenswrapper[4775]: I1216 15:05:02.868664 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:05:02 crc kubenswrapper[4775]: I1216 15:05:02.869154 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:05:02 crc kubenswrapper[4775]: I1216 15:05:02.869199 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 15:05:02 crc kubenswrapper[4775]: I1216 15:05:02.869776 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3726a17a41d21de0c1144f1afb1120defbbe2b018d4ec48bc1ed4d607865dfc9"} pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:05:02 crc kubenswrapper[4775]: I1216 15:05:02.869830 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" 
podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" containerID="cri-o://3726a17a41d21de0c1144f1afb1120defbbe2b018d4ec48bc1ed4d607865dfc9" gracePeriod=600 Dec 16 15:05:03 crc kubenswrapper[4775]: I1216 15:05:03.696121 4775 generic.go:334] "Generic (PLEG): container finished" podID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerID="3726a17a41d21de0c1144f1afb1120defbbe2b018d4ec48bc1ed4d607865dfc9" exitCode=0 Dec 16 15:05:03 crc kubenswrapper[4775]: I1216 15:05:03.696167 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerDied","Data":"3726a17a41d21de0c1144f1afb1120defbbe2b018d4ec48bc1ed4d607865dfc9"} Dec 16 15:05:03 crc kubenswrapper[4775]: I1216 15:05:03.696474 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"790666d10a8413c7b1bed65625e744b82eacfed0c75d107b7bd78a845e4df70e"} Dec 16 15:05:03 crc kubenswrapper[4775]: I1216 15:05:03.696498 4775 scope.go:117] "RemoveContainer" containerID="9e3465738032f654be4ca1d279ee3d3d38d1b27f93903e514f0b4016991fc071" Dec 16 15:06:35 crc kubenswrapper[4775]: I1216 15:06:35.636224 4775 scope.go:117] "RemoveContainer" containerID="16fd6250eaf3040b1123085ef0b34df2a73ce6b6de76af294d1218391f61f88c" Dec 16 15:07:26 crc kubenswrapper[4775]: I1216 15:07:26.023292 4775 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 16 15:07:29 crc kubenswrapper[4775]: I1216 15:07:29.907334 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-s2cqf"] Dec 16 15:07:29 crc kubenswrapper[4775]: E1216 15:07:29.907978 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="55dc0f62-62c4-48e2-9eb9-4998ad616e7f" containerName="registry" Dec 16 15:07:29 crc kubenswrapper[4775]: I1216 15:07:29.907998 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="55dc0f62-62c4-48e2-9eb9-4998ad616e7f" containerName="registry" Dec 16 15:07:29 crc kubenswrapper[4775]: I1216 15:07:29.908141 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="55dc0f62-62c4-48e2-9eb9-4998ad616e7f" containerName="registry" Dec 16 15:07:29 crc kubenswrapper[4775]: I1216 15:07:29.908721 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-s2cqf" Dec 16 15:07:29 crc kubenswrapper[4775]: I1216 15:07:29.910503 4775 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-pqp6g" Dec 16 15:07:29 crc kubenswrapper[4775]: I1216 15:07:29.910763 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 16 15:07:29 crc kubenswrapper[4775]: I1216 15:07:29.911123 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 16 15:07:29 crc kubenswrapper[4775]: I1216 15:07:29.917972 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-s2cqf"] Dec 16 15:07:29 crc kubenswrapper[4775]: I1216 15:07:29.922941 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-gxr9j"] Dec 16 15:07:29 crc kubenswrapper[4775]: I1216 15:07:29.923647 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-gxr9j" Dec 16 15:07:29 crc kubenswrapper[4775]: I1216 15:07:29.925707 4775 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-4vjrb" Dec 16 15:07:29 crc kubenswrapper[4775]: I1216 15:07:29.931028 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-kh9z9"] Dec 16 15:07:29 crc kubenswrapper[4775]: I1216 15:07:29.948189 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-kh9z9" Dec 16 15:07:29 crc kubenswrapper[4775]: I1216 15:07:29.950942 4775 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-j9llj" Dec 16 15:07:29 crc kubenswrapper[4775]: I1216 15:07:29.965764 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-gxr9j"] Dec 16 15:07:29 crc kubenswrapper[4775]: I1216 15:07:29.970327 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-kh9z9"] Dec 16 15:07:29 crc kubenswrapper[4775]: I1216 15:07:29.988033 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brw9n\" (UniqueName: \"kubernetes.io/projected/4fc14e4b-fa58-41f3-b5b4-f27d75e6a294-kube-api-access-brw9n\") pod \"cert-manager-5b446d88c5-kh9z9\" (UID: \"4fc14e4b-fa58-41f3-b5b4-f27d75e6a294\") " pod="cert-manager/cert-manager-5b446d88c5-kh9z9" Dec 16 15:07:30 crc kubenswrapper[4775]: I1216 15:07:30.088977 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9bsv\" (UniqueName: \"kubernetes.io/projected/5aa53da3-90be-4e8d-874f-817fce504026-kube-api-access-l9bsv\") pod \"cert-manager-cainjector-7f985d654d-s2cqf\" (UID: \"5aa53da3-90be-4e8d-874f-817fce504026\") " 
pod="cert-manager/cert-manager-cainjector-7f985d654d-s2cqf" Dec 16 15:07:30 crc kubenswrapper[4775]: I1216 15:07:30.089059 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcg78\" (UniqueName: \"kubernetes.io/projected/370c0803-3050-431b-82e2-d3d69f5d386f-kube-api-access-gcg78\") pod \"cert-manager-webhook-5655c58dd6-gxr9j\" (UID: \"370c0803-3050-431b-82e2-d3d69f5d386f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-gxr9j" Dec 16 15:07:30 crc kubenswrapper[4775]: I1216 15:07:30.089121 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brw9n\" (UniqueName: \"kubernetes.io/projected/4fc14e4b-fa58-41f3-b5b4-f27d75e6a294-kube-api-access-brw9n\") pod \"cert-manager-5b446d88c5-kh9z9\" (UID: \"4fc14e4b-fa58-41f3-b5b4-f27d75e6a294\") " pod="cert-manager/cert-manager-5b446d88c5-kh9z9" Dec 16 15:07:30 crc kubenswrapper[4775]: I1216 15:07:30.110206 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brw9n\" (UniqueName: \"kubernetes.io/projected/4fc14e4b-fa58-41f3-b5b4-f27d75e6a294-kube-api-access-brw9n\") pod \"cert-manager-5b446d88c5-kh9z9\" (UID: \"4fc14e4b-fa58-41f3-b5b4-f27d75e6a294\") " pod="cert-manager/cert-manager-5b446d88c5-kh9z9" Dec 16 15:07:30 crc kubenswrapper[4775]: I1216 15:07:30.189860 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9bsv\" (UniqueName: \"kubernetes.io/projected/5aa53da3-90be-4e8d-874f-817fce504026-kube-api-access-l9bsv\") pod \"cert-manager-cainjector-7f985d654d-s2cqf\" (UID: \"5aa53da3-90be-4e8d-874f-817fce504026\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-s2cqf" Dec 16 15:07:30 crc kubenswrapper[4775]: I1216 15:07:30.189953 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcg78\" (UniqueName: 
\"kubernetes.io/projected/370c0803-3050-431b-82e2-d3d69f5d386f-kube-api-access-gcg78\") pod \"cert-manager-webhook-5655c58dd6-gxr9j\" (UID: \"370c0803-3050-431b-82e2-d3d69f5d386f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-gxr9j" Dec 16 15:07:30 crc kubenswrapper[4775]: I1216 15:07:30.207341 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9bsv\" (UniqueName: \"kubernetes.io/projected/5aa53da3-90be-4e8d-874f-817fce504026-kube-api-access-l9bsv\") pod \"cert-manager-cainjector-7f985d654d-s2cqf\" (UID: \"5aa53da3-90be-4e8d-874f-817fce504026\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-s2cqf" Dec 16 15:07:30 crc kubenswrapper[4775]: I1216 15:07:30.208095 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcg78\" (UniqueName: \"kubernetes.io/projected/370c0803-3050-431b-82e2-d3d69f5d386f-kube-api-access-gcg78\") pod \"cert-manager-webhook-5655c58dd6-gxr9j\" (UID: \"370c0803-3050-431b-82e2-d3d69f5d386f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-gxr9j" Dec 16 15:07:30 crc kubenswrapper[4775]: I1216 15:07:30.229235 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-s2cqf" Dec 16 15:07:30 crc kubenswrapper[4775]: I1216 15:07:30.254556 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-gxr9j" Dec 16 15:07:30 crc kubenswrapper[4775]: I1216 15:07:30.267836 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-kh9z9" Dec 16 15:07:30 crc kubenswrapper[4775]: I1216 15:07:30.505851 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-kh9z9"] Dec 16 15:07:30 crc kubenswrapper[4775]: I1216 15:07:30.522300 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 15:07:30 crc kubenswrapper[4775]: I1216 15:07:30.539585 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-gxr9j"] Dec 16 15:07:30 crc kubenswrapper[4775]: W1216 15:07:30.545712 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod370c0803_3050_431b_82e2_d3d69f5d386f.slice/crio-3d70d07267cc57ab410528053c38d2f89bcae79a3e8972d8b23a56bdcdcc685b WatchSource:0}: Error finding container 3d70d07267cc57ab410528053c38d2f89bcae79a3e8972d8b23a56bdcdcc685b: Status 404 returned error can't find the container with id 3d70d07267cc57ab410528053c38d2f89bcae79a3e8972d8b23a56bdcdcc685b Dec 16 15:07:30 crc kubenswrapper[4775]: I1216 15:07:30.648784 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-s2cqf"] Dec 16 15:07:30 crc kubenswrapper[4775]: W1216 15:07:30.652192 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5aa53da3_90be_4e8d_874f_817fce504026.slice/crio-2082b41ae3452d4db04e39d4cc4f554a66948a785252e36b4b398aa8792bccd4 WatchSource:0}: Error finding container 2082b41ae3452d4db04e39d4cc4f554a66948a785252e36b4b398aa8792bccd4: Status 404 returned error can't find the container with id 2082b41ae3452d4db04e39d4cc4f554a66948a785252e36b4b398aa8792bccd4 Dec 16 15:07:31 crc kubenswrapper[4775]: I1216 15:07:31.514761 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-5b446d88c5-kh9z9" event={"ID":"4fc14e4b-fa58-41f3-b5b4-f27d75e6a294","Type":"ContainerStarted","Data":"3123a076b060a56c15dedbf2000c2529979420d39965dd7853654b452a3de72c"} Dec 16 15:07:31 crc kubenswrapper[4775]: I1216 15:07:31.517538 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-s2cqf" event={"ID":"5aa53da3-90be-4e8d-874f-817fce504026","Type":"ContainerStarted","Data":"2082b41ae3452d4db04e39d4cc4f554a66948a785252e36b4b398aa8792bccd4"} Dec 16 15:07:31 crc kubenswrapper[4775]: I1216 15:07:31.519075 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-gxr9j" event={"ID":"370c0803-3050-431b-82e2-d3d69f5d386f","Type":"ContainerStarted","Data":"3d70d07267cc57ab410528053c38d2f89bcae79a3e8972d8b23a56bdcdcc685b"} Dec 16 15:07:32 crc kubenswrapper[4775]: I1216 15:07:32.868640 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:07:32 crc kubenswrapper[4775]: I1216 15:07:32.868701 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:07:34 crc kubenswrapper[4775]: I1216 15:07:34.544234 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-gxr9j" event={"ID":"370c0803-3050-431b-82e2-d3d69f5d386f","Type":"ContainerStarted","Data":"40b7596a6a17837ccb30d50ad5e54d7c48464581943d665f65af551f048f1966"} Dec 16 15:07:34 crc kubenswrapper[4775]: I1216 15:07:34.544603 4775 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-gxr9j" Dec 16 15:07:34 crc kubenswrapper[4775]: I1216 15:07:34.547610 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-kh9z9" event={"ID":"4fc14e4b-fa58-41f3-b5b4-f27d75e6a294","Type":"ContainerStarted","Data":"36701fa03fda4fe41202eacda77a752f0409d70c1ceefdc7c9c8b06be9670786"} Dec 16 15:07:34 crc kubenswrapper[4775]: I1216 15:07:34.569617 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-gxr9j" podStartSLOduration=2.663826353 podStartE2EDuration="5.569587676s" podCreationTimestamp="2025-12-16 15:07:29 +0000 UTC" firstStartedPulling="2025-12-16 15:07:30.547961344 +0000 UTC m=+775.499040267" lastFinishedPulling="2025-12-16 15:07:33.453722667 +0000 UTC m=+778.404801590" observedRunningTime="2025-12-16 15:07:34.561978236 +0000 UTC m=+779.513084630" watchObservedRunningTime="2025-12-16 15:07:34.569587676 +0000 UTC m=+779.520666619" Dec 16 15:07:34 crc kubenswrapper[4775]: I1216 15:07:34.586703 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-kh9z9" podStartSLOduration=2.649347176 podStartE2EDuration="5.586676196s" podCreationTimestamp="2025-12-16 15:07:29 +0000 UTC" firstStartedPulling="2025-12-16 15:07:30.522081266 +0000 UTC m=+775.473160189" lastFinishedPulling="2025-12-16 15:07:33.459410286 +0000 UTC m=+778.410489209" observedRunningTime="2025-12-16 15:07:34.580687117 +0000 UTC m=+779.531766090" watchObservedRunningTime="2025-12-16 15:07:34.586676196 +0000 UTC m=+779.537755129" Dec 16 15:07:35 crc kubenswrapper[4775]: I1216 15:07:35.554380 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-s2cqf" 
event={"ID":"5aa53da3-90be-4e8d-874f-817fce504026","Type":"ContainerStarted","Data":"386df91765a32b48c7de3e7d39f22cf8d585d67c84219d3a1aa28a1845dc374e"} Dec 16 15:07:35 crc kubenswrapper[4775]: I1216 15:07:35.580319 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-s2cqf" podStartSLOduration=2.863160961 podStartE2EDuration="6.580293064s" podCreationTimestamp="2025-12-16 15:07:29 +0000 UTC" firstStartedPulling="2025-12-16 15:07:30.654016434 +0000 UTC m=+775.605095347" lastFinishedPulling="2025-12-16 15:07:34.371148517 +0000 UTC m=+779.322227450" observedRunningTime="2025-12-16 15:07:35.576177645 +0000 UTC m=+780.527256598" watchObservedRunningTime="2025-12-16 15:07:35.580293064 +0000 UTC m=+780.531372007" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.258628 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-gxr9j" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.552954 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-79w7z"] Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.553628 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovn-controller" containerID="cri-o://51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814" gracePeriod=30 Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.554116 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="sbdb" containerID="cri-o://84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3" gracePeriod=30 Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.554250 4775 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="nbdb" containerID="cri-o://4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c" gracePeriod=30 Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.554352 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="northd" containerID="cri-o://c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583" gracePeriod=30 Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.554446 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683" gracePeriod=30 Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.554539 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="kube-rbac-proxy-node" containerID="cri-o://84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555" gracePeriod=30 Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.554635 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovn-acl-logging" containerID="cri-o://edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c" gracePeriod=30 Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.600711 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovnkube-controller" 
containerID="cri-o://bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7" gracePeriod=30 Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.831490 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovnkube-controller/3.log" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.833143 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovn-acl-logging/0.log" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.833469 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovn-controller/0.log" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.833801 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.893377 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7ptdp"] Dec 16 15:07:40 crc kubenswrapper[4775]: E1216 15:07:40.893827 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovnkube-controller" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.893849 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovnkube-controller" Dec 16 15:07:40 crc kubenswrapper[4775]: E1216 15:07:40.893871 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="kubecfg-setup" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.893879 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="kubecfg-setup" Dec 16 15:07:40 crc kubenswrapper[4775]: 
E1216 15:07:40.893916 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovnkube-controller" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.893925 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovnkube-controller" Dec 16 15:07:40 crc kubenswrapper[4775]: E1216 15:07:40.893935 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovnkube-controller" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.893943 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovnkube-controller" Dec 16 15:07:40 crc kubenswrapper[4775]: E1216 15:07:40.893956 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="kube-rbac-proxy-ovn-metrics" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.893965 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="kube-rbac-proxy-ovn-metrics" Dec 16 15:07:40 crc kubenswrapper[4775]: E1216 15:07:40.893980 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovn-acl-logging" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.893990 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovn-acl-logging" Dec 16 15:07:40 crc kubenswrapper[4775]: E1216 15:07:40.894010 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovn-controller" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894018 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovn-controller" Dec 16 15:07:40 crc 
kubenswrapper[4775]: E1216 15:07:40.894029 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="sbdb" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894037 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="sbdb" Dec 16 15:07:40 crc kubenswrapper[4775]: E1216 15:07:40.894049 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="kube-rbac-proxy-node" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894057 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="kube-rbac-proxy-node" Dec 16 15:07:40 crc kubenswrapper[4775]: E1216 15:07:40.894072 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="nbdb" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894080 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="nbdb" Dec 16 15:07:40 crc kubenswrapper[4775]: E1216 15:07:40.894093 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="northd" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894102 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="northd" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894256 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovn-acl-logging" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894267 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovnkube-controller" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894277 4775 
memory_manager.go:354] "RemoveStaleState removing state" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="northd" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894287 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="kube-rbac-proxy-ovn-metrics" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894299 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovnkube-controller" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894311 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="sbdb" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894320 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovnkube-controller" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894328 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovn-controller" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894337 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="kube-rbac-proxy-node" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894346 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="nbdb" Dec 16 15:07:40 crc kubenswrapper[4775]: E1216 15:07:40.894479 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovnkube-controller" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894491 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovnkube-controller" Dec 16 15:07:40 crc kubenswrapper[4775]: E1216 
15:07:40.894501 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovnkube-controller" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894509 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovnkube-controller" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.894662 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovnkube-controller" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.895097 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerName="ovnkube-controller" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.897234 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.943923 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcjwq\" (UniqueName: \"kubernetes.io/projected/524488dd-74ee-43ea-ac0f-5e04d59af434-kube-api-access-gcjwq\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.943965 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-etc-openvswitch\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944021 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-openvswitch\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: 
\"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944045 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-var-lib-openvswitch\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944066 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-cni-netd\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944099 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-systemd\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944126 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-slash\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944119 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944154 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-env-overrides\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944154 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944119 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944202 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-slash" (OuterVolumeSpecName: "host-slash") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944221 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-run-netns\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944240 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-node-log\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944263 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944300 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944350 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-systemd-units\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944398 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-node-log" (OuterVolumeSpecName: "node-log") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944407 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944371 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-run-ovn-kubernetes\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944466 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-ovn\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944478 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944533 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-ovnkube-config\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944595 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944571 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944644 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-log-socket\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944688 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-var-lib-cni-networks-ovn-kubernetes\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944716 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-ovnkube-script-lib\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944770 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-cni-bin\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 
15:07:40.944716 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-log-socket" (OuterVolumeSpecName: "log-socket") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944739 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944839 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-kubelet\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944847 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944909 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/524488dd-74ee-43ea-ac0f-5e04d59af434-ovn-node-metrics-cert\") pod \"524488dd-74ee-43ea-ac0f-5e04d59af434\" (UID: \"524488dd-74ee-43ea-ac0f-5e04d59af434\") " Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.944921 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.945131 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.945453 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.945500 4775 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.945578 4775 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-node-log\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.945611 4775 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.945637 4775 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.945713 4775 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.945757 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.945794 4775 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-log-socket\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.945830 4775 
reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.945852 4775 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.945902 4775 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.945933 4775 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.945959 4775 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.945988 4775 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.946060 4775 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.946119 4775 reconciler_common.go:293] "Volume detached for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-host-slash\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.946150 4775 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.948840 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/524488dd-74ee-43ea-ac0f-5e04d59af434-kube-api-access-gcjwq" (OuterVolumeSpecName: "kube-api-access-gcjwq") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "kube-api-access-gcjwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.949730 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524488dd-74ee-43ea-ac0f-5e04d59af434-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:07:40 crc kubenswrapper[4775]: I1216 15:07:40.956788 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "524488dd-74ee-43ea-ac0f-5e04d59af434" (UID: "524488dd-74ee-43ea-ac0f-5e04d59af434"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.048627 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.048722 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-run-systemd\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.048772 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-ovnkube-config\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.048809 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-env-overrides\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.048842 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-systemd-units\") pod 
\"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.048872 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-run-netns\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.048951 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-run-ovn-kubernetes\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.049011 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zt9d\" (UniqueName: \"kubernetes.io/projected/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-kube-api-access-5zt9d\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.049116 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-etc-openvswitch\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.049167 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-ovnkube-script-lib\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.049204 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-kubelet\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.049345 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-cni-bin\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.049436 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-cni-netd\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.049473 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-node-log\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.049521 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-log-socket\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.049553 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-var-lib-openvswitch\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.049622 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-slash\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.049660 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-run-ovn\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.049725 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-run-openvswitch\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.049757 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-ovn-node-metrics-cert\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.049838 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/524488dd-74ee-43ea-ac0f-5e04d59af434-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.049863 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/524488dd-74ee-43ea-ac0f-5e04d59af434-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.049914 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcjwq\" (UniqueName: \"kubernetes.io/projected/524488dd-74ee-43ea-ac0f-5e04d59af434-kube-api-access-gcjwq\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.049960 4775 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/524488dd-74ee-43ea-ac0f-5e04d59af434-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.151526 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-run-systemd\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.151582 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-ovnkube-config\") pod \"ovnkube-node-7ptdp\" (UID: 
\"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.151607 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-env-overrides\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.151626 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-systemd-units\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.151647 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-run-netns\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.151667 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-run-ovn-kubernetes\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.151704 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zt9d\" (UniqueName: \"kubernetes.io/projected/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-kube-api-access-5zt9d\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.151724 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-etc-openvswitch\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.151748 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-ovnkube-script-lib\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.151772 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-kubelet\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.151804 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-cni-bin\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.151824 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-cni-netd\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc 
kubenswrapper[4775]: I1216 15:07:41.151843 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-node-log\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.151866 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-log-socket\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.151908 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-var-lib-openvswitch\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.151933 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-slash\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.151955 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-run-ovn\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.151988 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-run-openvswitch\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.152010 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-ovn-node-metrics-cert\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.152034 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.152618 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-slash\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.152704 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-cni-bin\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.152756 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-cni-netd\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.152820 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-node-log\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.152865 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-log-socket\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.152959 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-var-lib-openvswitch\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.153661 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-run-ovn\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.153794 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-run-netns\") pod \"ovnkube-node-7ptdp\" (UID: 
\"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.153863 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-run-systemd\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.155164 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-ovnkube-config\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.155284 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.155408 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-run-ovn-kubernetes\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.155300 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-systemd-units\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.155557 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-etc-openvswitch\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.155933 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-run-openvswitch\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.156025 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-host-kubelet\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.156517 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-env-overrides\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.158710 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-ovn-node-metrics-cert\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 
15:07:41.160133 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-ovnkube-script-lib\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.176212 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zt9d\" (UniqueName: \"kubernetes.io/projected/4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0-kube-api-access-5zt9d\") pod \"ovnkube-node-7ptdp\" (UID: \"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.213531 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.598229 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" event={"ID":"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0","Type":"ContainerStarted","Data":"924a0d29802df3d0e311ae6e0198441933c324f597fb3c20c075a5e8e8fa74e6"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.601251 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc2lg_f108f76f-c79a-42b0-b5ac-714d49d9a4d5/kube-multus/2.log" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.602133 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc2lg_f108f76f-c79a-42b0-b5ac-714d49d9a4d5/kube-multus/1.log" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.602202 4775 generic.go:334] "Generic (PLEG): container finished" podID="f108f76f-c79a-42b0-b5ac-714d49d9a4d5" containerID="bd5bf8d9aa860c638df224881d6e2c78b66ea54d6e2e871aebcdf55ac2dc99ce" exitCode=2 Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.602293 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mc2lg" event={"ID":"f108f76f-c79a-42b0-b5ac-714d49d9a4d5","Type":"ContainerDied","Data":"bd5bf8d9aa860c638df224881d6e2c78b66ea54d6e2e871aebcdf55ac2dc99ce"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.602350 4775 scope.go:117] "RemoveContainer" containerID="df66b9c818cf970df880bf19cf5d511f23a4ff7bebd59e241339dd26e0ac8fa0" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.603270 4775 scope.go:117] "RemoveContainer" containerID="bd5bf8d9aa860c638df224881d6e2c78b66ea54d6e2e871aebcdf55ac2dc99ce" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.608151 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovnkube-controller/3.log" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.613094 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovn-acl-logging/0.log" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.614758 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79w7z_524488dd-74ee-43ea-ac0f-5e04d59af434/ovn-controller/0.log" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615188 4775 generic.go:334] "Generic (PLEG): container finished" podID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerID="bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7" exitCode=0 Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615227 4775 generic.go:334] "Generic (PLEG): container finished" podID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerID="84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3" exitCode=0 Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615240 4775 generic.go:334] "Generic (PLEG): container finished" podID="524488dd-74ee-43ea-ac0f-5e04d59af434" 
containerID="4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c" exitCode=0 Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615252 4775 generic.go:334] "Generic (PLEG): container finished" podID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerID="c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583" exitCode=0 Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615262 4775 generic.go:334] "Generic (PLEG): container finished" podID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerID="e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683" exitCode=0 Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615271 4775 generic.go:334] "Generic (PLEG): container finished" podID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerID="84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555" exitCode=0 Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615279 4775 generic.go:334] "Generic (PLEG): container finished" podID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerID="edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c" exitCode=143 Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615290 4775 generic.go:334] "Generic (PLEG): container finished" podID="524488dd-74ee-43ea-ac0f-5e04d59af434" containerID="51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814" exitCode=143 Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615277 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerDied","Data":"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615320 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615339 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerDied","Data":"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615364 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerDied","Data":"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615383 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerDied","Data":"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615401 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerDied","Data":"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615420 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerDied","Data":"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615438 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615456 4775 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615466 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615476 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615486 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615496 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615505 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615515 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615524 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615533 4775 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615549 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerDied","Data":"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615568 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615579 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615589 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615598 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615607 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615617 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683"} Dec 16 
15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615659 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615669 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615678 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615687 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615702 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerDied","Data":"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615718 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615728 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615740 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615750 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615760 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615770 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615780 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615792 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615802 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615813 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615826 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-79w7z" event={"ID":"524488dd-74ee-43ea-ac0f-5e04d59af434","Type":"ContainerDied","Data":"2ef64c25ce5ae2d4b03af0088361e90d21c0e774f0b7e35e863b36c08e80df16"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615842 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615853 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615863 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615873 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615903 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615914 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615923 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 
15:07:41.615932 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615942 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.615952 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79"} Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.651221 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-79w7z"] Dec 16 15:07:41 crc kubenswrapper[4775]: I1216 15:07:41.656747 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-79w7z"] Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.260920 4775 scope.go:117] "RemoveContainer" containerID="bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.279226 4775 scope.go:117] "RemoveContainer" containerID="cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.296854 4775 scope.go:117] "RemoveContainer" containerID="84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.311728 4775 scope.go:117] "RemoveContainer" containerID="4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.324857 4775 scope.go:117] "RemoveContainer" containerID="c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.338775 
4775 scope.go:117] "RemoveContainer" containerID="e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.374874 4775 scope.go:117] "RemoveContainer" containerID="84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.387157 4775 scope.go:117] "RemoveContainer" containerID="edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.401904 4775 scope.go:117] "RemoveContainer" containerID="51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.418696 4775 scope.go:117] "RemoveContainer" containerID="fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.431850 4775 scope.go:117] "RemoveContainer" containerID="bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7" Dec 16 15:07:42 crc kubenswrapper[4775]: E1216 15:07:42.432400 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7\": container with ID starting with bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7 not found: ID does not exist" containerID="bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.432488 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7"} err="failed to get container status \"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7\": rpc error: code = NotFound desc = could not find container \"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7\": container with ID starting with 
bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.432522 4775 scope.go:117] "RemoveContainer" containerID="cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee" Dec 16 15:07:42 crc kubenswrapper[4775]: E1216 15:07:42.432919 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\": container with ID starting with cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee not found: ID does not exist" containerID="cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.432957 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee"} err="failed to get container status \"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\": rpc error: code = NotFound desc = could not find container \"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\": container with ID starting with cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.432972 4775 scope.go:117] "RemoveContainer" containerID="84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3" Dec 16 15:07:42 crc kubenswrapper[4775]: E1216 15:07:42.433310 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\": container with ID starting with 84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3 not found: ID does not exist" containerID="84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3" Dec 16 15:07:42 crc 
kubenswrapper[4775]: I1216 15:07:42.433351 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3"} err="failed to get container status \"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\": rpc error: code = NotFound desc = could not find container \"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\": container with ID starting with 84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.433380 4775 scope.go:117] "RemoveContainer" containerID="4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c" Dec 16 15:07:42 crc kubenswrapper[4775]: E1216 15:07:42.433682 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\": container with ID starting with 4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c not found: ID does not exist" containerID="4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.433722 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c"} err="failed to get container status \"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\": rpc error: code = NotFound desc = could not find container \"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\": container with ID starting with 4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.433743 4775 scope.go:117] "RemoveContainer" containerID="c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583" Dec 16 
15:07:42 crc kubenswrapper[4775]: E1216 15:07:42.434419 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\": container with ID starting with c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583 not found: ID does not exist" containerID="c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.434451 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583"} err="failed to get container status \"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\": rpc error: code = NotFound desc = could not find container \"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\": container with ID starting with c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.434470 4775 scope.go:117] "RemoveContainer" containerID="e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683" Dec 16 15:07:42 crc kubenswrapper[4775]: E1216 15:07:42.434767 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\": container with ID starting with e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683 not found: ID does not exist" containerID="e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.434805 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683"} err="failed to get container status 
\"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\": rpc error: code = NotFound desc = could not find container \"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\": container with ID starting with e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.434829 4775 scope.go:117] "RemoveContainer" containerID="84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555" Dec 16 15:07:42 crc kubenswrapper[4775]: E1216 15:07:42.435150 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\": container with ID starting with 84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555 not found: ID does not exist" containerID="84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.435181 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555"} err="failed to get container status \"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\": rpc error: code = NotFound desc = could not find container \"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\": container with ID starting with 84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.435221 4775 scope.go:117] "RemoveContainer" containerID="edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c" Dec 16 15:07:42 crc kubenswrapper[4775]: E1216 15:07:42.435544 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\": container with ID starting with edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c not found: ID does not exist" containerID="edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.435571 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c"} err="failed to get container status \"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\": rpc error: code = NotFound desc = could not find container \"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\": container with ID starting with edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.435617 4775 scope.go:117] "RemoveContainer" containerID="51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814" Dec 16 15:07:42 crc kubenswrapper[4775]: E1216 15:07:42.435949 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\": container with ID starting with 51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814 not found: ID does not exist" containerID="51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.435977 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814"} err="failed to get container status \"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\": rpc error: code = NotFound desc = could not find container \"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\": container with ID 
starting with 51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.435998 4775 scope.go:117] "RemoveContainer" containerID="fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79" Dec 16 15:07:42 crc kubenswrapper[4775]: E1216 15:07:42.436264 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\": container with ID starting with fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79 not found: ID does not exist" containerID="fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.436296 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79"} err="failed to get container status \"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\": rpc error: code = NotFound desc = could not find container \"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\": container with ID starting with fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.436315 4775 scope.go:117] "RemoveContainer" containerID="bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.436536 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7"} err="failed to get container status \"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7\": rpc error: code = NotFound desc = could not find container \"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7\": 
container with ID starting with bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.436557 4775 scope.go:117] "RemoveContainer" containerID="cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.436809 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee"} err="failed to get container status \"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\": rpc error: code = NotFound desc = could not find container \"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\": container with ID starting with cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.436824 4775 scope.go:117] "RemoveContainer" containerID="84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.437072 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3"} err="failed to get container status \"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\": rpc error: code = NotFound desc = could not find container \"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\": container with ID starting with 84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.437103 4775 scope.go:117] "RemoveContainer" containerID="4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.437320 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c"} err="failed to get container status \"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\": rpc error: code = NotFound desc = could not find container \"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\": container with ID starting with 4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.437342 4775 scope.go:117] "RemoveContainer" containerID="c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.437553 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583"} err="failed to get container status \"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\": rpc error: code = NotFound desc = could not find container \"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\": container with ID starting with c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.437577 4775 scope.go:117] "RemoveContainer" containerID="e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.437768 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683"} err="failed to get container status \"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\": rpc error: code = NotFound desc = could not find container \"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\": container with ID starting with e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683 not found: ID does not 
exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.437792 4775 scope.go:117] "RemoveContainer" containerID="84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.438087 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555"} err="failed to get container status \"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\": rpc error: code = NotFound desc = could not find container \"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\": container with ID starting with 84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.438116 4775 scope.go:117] "RemoveContainer" containerID="edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.438307 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c"} err="failed to get container status \"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\": rpc error: code = NotFound desc = could not find container \"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\": container with ID starting with edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.438330 4775 scope.go:117] "RemoveContainer" containerID="51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.438504 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814"} err="failed to get container status 
\"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\": rpc error: code = NotFound desc = could not find container \"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\": container with ID starting with 51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.438528 4775 scope.go:117] "RemoveContainer" containerID="fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.438756 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79"} err="failed to get container status \"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\": rpc error: code = NotFound desc = could not find container \"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\": container with ID starting with fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.438784 4775 scope.go:117] "RemoveContainer" containerID="bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.439353 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7"} err="failed to get container status \"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7\": rpc error: code = NotFound desc = could not find container \"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7\": container with ID starting with bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.439382 4775 scope.go:117] "RemoveContainer" 
containerID="cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.439663 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee"} err="failed to get container status \"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\": rpc error: code = NotFound desc = could not find container \"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\": container with ID starting with cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.439682 4775 scope.go:117] "RemoveContainer" containerID="84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.439950 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3"} err="failed to get container status \"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\": rpc error: code = NotFound desc = could not find container \"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\": container with ID starting with 84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.439975 4775 scope.go:117] "RemoveContainer" containerID="4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.440183 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c"} err="failed to get container status \"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\": rpc error: code = NotFound desc = could 
not find container \"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\": container with ID starting with 4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.440208 4775 scope.go:117] "RemoveContainer" containerID="c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.440928 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583"} err="failed to get container status \"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\": rpc error: code = NotFound desc = could not find container \"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\": container with ID starting with c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.440955 4775 scope.go:117] "RemoveContainer" containerID="e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.441289 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683"} err="failed to get container status \"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\": rpc error: code = NotFound desc = could not find container \"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\": container with ID starting with e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.441314 4775 scope.go:117] "RemoveContainer" containerID="84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 
15:07:42.441698 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555"} err="failed to get container status \"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\": rpc error: code = NotFound desc = could not find container \"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\": container with ID starting with 84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.441726 4775 scope.go:117] "RemoveContainer" containerID="edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.441957 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c"} err="failed to get container status \"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\": rpc error: code = NotFound desc = could not find container \"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\": container with ID starting with edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.441979 4775 scope.go:117] "RemoveContainer" containerID="51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.442183 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814"} err="failed to get container status \"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\": rpc error: code = NotFound desc = could not find container \"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\": container with ID starting with 
51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.442210 4775 scope.go:117] "RemoveContainer" containerID="fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.442408 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79"} err="failed to get container status \"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\": rpc error: code = NotFound desc = could not find container \"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\": container with ID starting with fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.442428 4775 scope.go:117] "RemoveContainer" containerID="bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.442675 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7"} err="failed to get container status \"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7\": rpc error: code = NotFound desc = could not find container \"bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7\": container with ID starting with bc7a1a892b834ade6a2dc13ffbff79b7b0df526f656b8d82ed999580c75b8df7 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.442705 4775 scope.go:117] "RemoveContainer" containerID="cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.442925 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee"} err="failed to get container status \"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\": rpc error: code = NotFound desc = could not find container \"cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee\": container with ID starting with cd8a6e19f03a9eeb3ec830a68c23a5b1d54090695a9eea19d65c2a8d28b832ee not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.442949 4775 scope.go:117] "RemoveContainer" containerID="84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.443163 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3"} err="failed to get container status \"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\": rpc error: code = NotFound desc = could not find container \"84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3\": container with ID starting with 84607889b5cd240a0390038803831ff2d77b72096436f6f04987b099fdfa6cd3 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.443188 4775 scope.go:117] "RemoveContainer" containerID="4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.443455 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c"} err="failed to get container status \"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\": rpc error: code = NotFound desc = could not find container \"4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c\": container with ID starting with 4e8db45ab04366211ac61ee79e88a80dc26876c3677004aa3fb6697cbac6284c not found: ID does not 
exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.443475 4775 scope.go:117] "RemoveContainer" containerID="c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.443714 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583"} err="failed to get container status \"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\": rpc error: code = NotFound desc = could not find container \"c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583\": container with ID starting with c589aab235e979cccdef12e5ac12280127b0bb0140cc759578954d0bcc8de583 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.443762 4775 scope.go:117] "RemoveContainer" containerID="e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.444154 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683"} err="failed to get container status \"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\": rpc error: code = NotFound desc = could not find container \"e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683\": container with ID starting with e75d9e8527f91bc82246c76ea969e2b317e8404d3c92a7393075e21142f47683 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.444199 4775 scope.go:117] "RemoveContainer" containerID="84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.444708 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555"} err="failed to get container status 
\"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\": rpc error: code = NotFound desc = could not find container \"84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555\": container with ID starting with 84f2f341953f49c2e8f19457a4a4485cad05e4e6779576640d46580668f18555 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.444736 4775 scope.go:117] "RemoveContainer" containerID="edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.445014 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c"} err="failed to get container status \"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\": rpc error: code = NotFound desc = could not find container \"edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c\": container with ID starting with edbedf627a72cc07058e61f52dc0914a491f6d4a1eb8f5ac71db47d4ab34e38c not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.445034 4775 scope.go:117] "RemoveContainer" containerID="51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.445218 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814"} err="failed to get container status \"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\": rpc error: code = NotFound desc = could not find container \"51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814\": container with ID starting with 51c34a45ce6fa34dbdc9567b3c75a607cbc7012d01e59db51f4c25b5d4c6a814 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.445238 4775 scope.go:117] "RemoveContainer" 
containerID="fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.445422 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79"} err="failed to get container status \"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\": rpc error: code = NotFound desc = could not find container \"fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79\": container with ID starting with fc6bbb3366407f90184913b16ec4385f119a1b0c120ab8fc5859af1d90651e79 not found: ID does not exist" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.623784 4775 generic.go:334] "Generic (PLEG): container finished" podID="4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0" containerID="87e43a011649d6a663c9ebd6c8372cac038739ea37cb379e037d8e2ef55bbc84" exitCode=0 Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.623844 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" event={"ID":"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0","Type":"ContainerDied","Data":"87e43a011649d6a663c9ebd6c8372cac038739ea37cb379e037d8e2ef55bbc84"} Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.630810 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mc2lg_f108f76f-c79a-42b0-b5ac-714d49d9a4d5/kube-multus/2.log" Dec 16 15:07:42 crc kubenswrapper[4775]: I1216 15:07:42.631073 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mc2lg" event={"ID":"f108f76f-c79a-42b0-b5ac-714d49d9a4d5","Type":"ContainerStarted","Data":"1af58c904b36f1e078750b818b722b191f23ac55d11482080ffc90469d8d16c4"} Dec 16 15:07:43 crc kubenswrapper[4775]: I1216 15:07:43.348108 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="524488dd-74ee-43ea-ac0f-5e04d59af434" 
path="/var/lib/kubelet/pods/524488dd-74ee-43ea-ac0f-5e04d59af434/volumes" Dec 16 15:07:43 crc kubenswrapper[4775]: I1216 15:07:43.640854 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" event={"ID":"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0","Type":"ContainerStarted","Data":"397c149b8129e23d869b22d03c0ea0b25bdaf9a9f4f9310a611afe3231400be1"} Dec 16 15:07:43 crc kubenswrapper[4775]: I1216 15:07:43.640918 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" event={"ID":"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0","Type":"ContainerStarted","Data":"e0cdef47808f63dd97d37648e12f250aab5c2bf14ffc8119bc3a7530fffe583f"} Dec 16 15:07:43 crc kubenswrapper[4775]: I1216 15:07:43.640929 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" event={"ID":"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0","Type":"ContainerStarted","Data":"12f3979ef7c63fa22734cfccc9956d47068a4ca2274c7b7a9cddf9321a646628"} Dec 16 15:07:43 crc kubenswrapper[4775]: I1216 15:07:43.640940 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" event={"ID":"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0","Type":"ContainerStarted","Data":"2723572c9ab52355b30f614bd0e61bd827b00afda727f2f7b9664ac04e293ade"} Dec 16 15:07:43 crc kubenswrapper[4775]: I1216 15:07:43.640948 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" event={"ID":"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0","Type":"ContainerStarted","Data":"d4a41fb771fc82ec3c7bb72dff3104c9781d95049ba6eddbeeec62df1aa7a260"} Dec 16 15:07:43 crc kubenswrapper[4775]: I1216 15:07:43.640957 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" event={"ID":"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0","Type":"ContainerStarted","Data":"bbda00fbde011df46ca0517d3745a667d88b709d4d21438b50e41fcd11977421"} Dec 
16 15:07:45 crc kubenswrapper[4775]: I1216 15:07:45.657909 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" event={"ID":"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0","Type":"ContainerStarted","Data":"df51a042add7d2d8eee0a811e6011d3d3fab7dc35e00c6af317c9e981bfeb67f"} Dec 16 15:07:48 crc kubenswrapper[4775]: I1216 15:07:48.690427 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" event={"ID":"4af5a06d-cc14-4fd2-b9f8-6a114feb7ed0","Type":"ContainerStarted","Data":"a6559d883a0cb0fbb1ef9abac6bdc77069f97f6f8a88dff631dc7f6670d5d8b9"} Dec 16 15:07:48 crc kubenswrapper[4775]: I1216 15:07:48.691023 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:48 crc kubenswrapper[4775]: I1216 15:07:48.691039 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:48 crc kubenswrapper[4775]: I1216 15:07:48.691073 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:48 crc kubenswrapper[4775]: I1216 15:07:48.717689 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" podStartSLOduration=8.717672261 podStartE2EDuration="8.717672261s" podCreationTimestamp="2025-12-16 15:07:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:07:48.715937617 +0000 UTC m=+793.667016560" watchObservedRunningTime="2025-12-16 15:07:48.717672261 +0000 UTC m=+793.668751184" Dec 16 15:07:48 crc kubenswrapper[4775]: I1216 15:07:48.722028 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:07:48 crc kubenswrapper[4775]: I1216 
15:07:48.722313 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:08:02 crc kubenswrapper[4775]: I1216 15:08:02.869567 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:08:02 crc kubenswrapper[4775]: I1216 15:08:02.870250 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:08:11 crc kubenswrapper[4775]: I1216 15:08:11.237623 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7ptdp" Dec 16 15:08:18 crc kubenswrapper[4775]: I1216 15:08:18.986319 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx"] Dec 16 15:08:18 crc kubenswrapper[4775]: I1216 15:08:18.987818 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" Dec 16 15:08:18 crc kubenswrapper[4775]: I1216 15:08:18.990066 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 16 15:08:19 crc kubenswrapper[4775]: I1216 15:08:19.000439 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx"] Dec 16 15:08:19 crc kubenswrapper[4775]: I1216 15:08:19.176714 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/739f7090-9a46-4ae3-a85b-045a2b1e197d-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx\" (UID: \"739f7090-9a46-4ae3-a85b-045a2b1e197d\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" Dec 16 15:08:19 crc kubenswrapper[4775]: I1216 15:08:19.176765 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7ndv\" (UniqueName: \"kubernetes.io/projected/739f7090-9a46-4ae3-a85b-045a2b1e197d-kube-api-access-p7ndv\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx\" (UID: \"739f7090-9a46-4ae3-a85b-045a2b1e197d\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" Dec 16 15:08:19 crc kubenswrapper[4775]: I1216 15:08:19.176928 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/739f7090-9a46-4ae3-a85b-045a2b1e197d-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx\" (UID: \"739f7090-9a46-4ae3-a85b-045a2b1e197d\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" Dec 16 15:08:19 crc kubenswrapper[4775]: 
I1216 15:08:19.278476 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7ndv\" (UniqueName: \"kubernetes.io/projected/739f7090-9a46-4ae3-a85b-045a2b1e197d-kube-api-access-p7ndv\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx\" (UID: \"739f7090-9a46-4ae3-a85b-045a2b1e197d\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" Dec 16 15:08:19 crc kubenswrapper[4775]: I1216 15:08:19.278800 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/739f7090-9a46-4ae3-a85b-045a2b1e197d-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx\" (UID: \"739f7090-9a46-4ae3-a85b-045a2b1e197d\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" Dec 16 15:08:19 crc kubenswrapper[4775]: I1216 15:08:19.278979 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/739f7090-9a46-4ae3-a85b-045a2b1e197d-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx\" (UID: \"739f7090-9a46-4ae3-a85b-045a2b1e197d\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" Dec 16 15:08:19 crc kubenswrapper[4775]: I1216 15:08:19.279345 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/739f7090-9a46-4ae3-a85b-045a2b1e197d-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx\" (UID: \"739f7090-9a46-4ae3-a85b-045a2b1e197d\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" Dec 16 15:08:19 crc kubenswrapper[4775]: I1216 15:08:19.279460 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/739f7090-9a46-4ae3-a85b-045a2b1e197d-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx\" (UID: \"739f7090-9a46-4ae3-a85b-045a2b1e197d\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" Dec 16 15:08:19 crc kubenswrapper[4775]: I1216 15:08:19.302928 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7ndv\" (UniqueName: \"kubernetes.io/projected/739f7090-9a46-4ae3-a85b-045a2b1e197d-kube-api-access-p7ndv\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx\" (UID: \"739f7090-9a46-4ae3-a85b-045a2b1e197d\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" Dec 16 15:08:19 crc kubenswrapper[4775]: I1216 15:08:19.307514 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" Dec 16 15:08:19 crc kubenswrapper[4775]: I1216 15:08:19.511596 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx"] Dec 16 15:08:19 crc kubenswrapper[4775]: I1216 15:08:19.875527 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" event={"ID":"739f7090-9a46-4ae3-a85b-045a2b1e197d","Type":"ContainerStarted","Data":"cdca790e169d91bb7aa48e930126c591d9d2595f4ab6184629f8ac9985014efc"} Dec 16 15:08:19 crc kubenswrapper[4775]: I1216 15:08:19.876084 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" event={"ID":"739f7090-9a46-4ae3-a85b-045a2b1e197d","Type":"ContainerStarted","Data":"41099dc1bb09d4af8199ac56cb7529970ab2fcb5f159de15f737b5ad00595873"} Dec 16 15:08:20 crc kubenswrapper[4775]: I1216 15:08:20.887804 4775 
generic.go:334] "Generic (PLEG): container finished" podID="739f7090-9a46-4ae3-a85b-045a2b1e197d" containerID="cdca790e169d91bb7aa48e930126c591d9d2595f4ab6184629f8ac9985014efc" exitCode=0 Dec 16 15:08:20 crc kubenswrapper[4775]: I1216 15:08:20.887913 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" event={"ID":"739f7090-9a46-4ae3-a85b-045a2b1e197d","Type":"ContainerDied","Data":"cdca790e169d91bb7aa48e930126c591d9d2595f4ab6184629f8ac9985014efc"} Dec 16 15:08:21 crc kubenswrapper[4775]: I1216 15:08:21.105269 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zssp7"] Dec 16 15:08:21 crc kubenswrapper[4775]: I1216 15:08:21.106628 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zssp7" Dec 16 15:08:21 crc kubenswrapper[4775]: I1216 15:08:21.120358 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zssp7"] Dec 16 15:08:21 crc kubenswrapper[4775]: I1216 15:08:21.206559 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6ec352-6085-4778-a301-012d5e7d4a6c-utilities\") pod \"redhat-operators-zssp7\" (UID: \"de6ec352-6085-4778-a301-012d5e7d4a6c\") " pod="openshift-marketplace/redhat-operators-zssp7" Dec 16 15:08:21 crc kubenswrapper[4775]: I1216 15:08:21.206611 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6ec352-6085-4778-a301-012d5e7d4a6c-catalog-content\") pod \"redhat-operators-zssp7\" (UID: \"de6ec352-6085-4778-a301-012d5e7d4a6c\") " pod="openshift-marketplace/redhat-operators-zssp7" Dec 16 15:08:21 crc kubenswrapper[4775]: I1216 15:08:21.206652 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4l9h\" (UniqueName: \"kubernetes.io/projected/de6ec352-6085-4778-a301-012d5e7d4a6c-kube-api-access-f4l9h\") pod \"redhat-operators-zssp7\" (UID: \"de6ec352-6085-4778-a301-012d5e7d4a6c\") " pod="openshift-marketplace/redhat-operators-zssp7" Dec 16 15:08:21 crc kubenswrapper[4775]: I1216 15:08:21.308374 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6ec352-6085-4778-a301-012d5e7d4a6c-utilities\") pod \"redhat-operators-zssp7\" (UID: \"de6ec352-6085-4778-a301-012d5e7d4a6c\") " pod="openshift-marketplace/redhat-operators-zssp7" Dec 16 15:08:21 crc kubenswrapper[4775]: I1216 15:08:21.308422 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6ec352-6085-4778-a301-012d5e7d4a6c-catalog-content\") pod \"redhat-operators-zssp7\" (UID: \"de6ec352-6085-4778-a301-012d5e7d4a6c\") " pod="openshift-marketplace/redhat-operators-zssp7" Dec 16 15:08:21 crc kubenswrapper[4775]: I1216 15:08:21.308456 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4l9h\" (UniqueName: \"kubernetes.io/projected/de6ec352-6085-4778-a301-012d5e7d4a6c-kube-api-access-f4l9h\") pod \"redhat-operators-zssp7\" (UID: \"de6ec352-6085-4778-a301-012d5e7d4a6c\") " pod="openshift-marketplace/redhat-operators-zssp7" Dec 16 15:08:21 crc kubenswrapper[4775]: I1216 15:08:21.309113 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6ec352-6085-4778-a301-012d5e7d4a6c-utilities\") pod \"redhat-operators-zssp7\" (UID: \"de6ec352-6085-4778-a301-012d5e7d4a6c\") " pod="openshift-marketplace/redhat-operators-zssp7" Dec 16 15:08:21 crc kubenswrapper[4775]: I1216 15:08:21.309123 4775 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6ec352-6085-4778-a301-012d5e7d4a6c-catalog-content\") pod \"redhat-operators-zssp7\" (UID: \"de6ec352-6085-4778-a301-012d5e7d4a6c\") " pod="openshift-marketplace/redhat-operators-zssp7" Dec 16 15:08:21 crc kubenswrapper[4775]: I1216 15:08:21.336853 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4l9h\" (UniqueName: \"kubernetes.io/projected/de6ec352-6085-4778-a301-012d5e7d4a6c-kube-api-access-f4l9h\") pod \"redhat-operators-zssp7\" (UID: \"de6ec352-6085-4778-a301-012d5e7d4a6c\") " pod="openshift-marketplace/redhat-operators-zssp7" Dec 16 15:08:21 crc kubenswrapper[4775]: I1216 15:08:21.425361 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zssp7" Dec 16 15:08:21 crc kubenswrapper[4775]: I1216 15:08:21.690429 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zssp7"] Dec 16 15:08:21 crc kubenswrapper[4775]: W1216 15:08:21.780704 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde6ec352_6085_4778_a301_012d5e7d4a6c.slice/crio-5aeb9015e8835c0da302dfd6141ad5c1959ec9acd9f6269e6c4e716c16dd3b6a WatchSource:0}: Error finding container 5aeb9015e8835c0da302dfd6141ad5c1959ec9acd9f6269e6c4e716c16dd3b6a: Status 404 returned error can't find the container with id 5aeb9015e8835c0da302dfd6141ad5c1959ec9acd9f6269e6c4e716c16dd3b6a Dec 16 15:08:21 crc kubenswrapper[4775]: I1216 15:08:21.893934 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zssp7" event={"ID":"de6ec352-6085-4778-a301-012d5e7d4a6c","Type":"ContainerStarted","Data":"5aeb9015e8835c0da302dfd6141ad5c1959ec9acd9f6269e6c4e716c16dd3b6a"} Dec 16 15:08:22 crc kubenswrapper[4775]: I1216 15:08:22.901564 4775 generic.go:334] "Generic (PLEG): container finished" 
podID="739f7090-9a46-4ae3-a85b-045a2b1e197d" containerID="ddde550bcf81cc171e595fb944197135799b73f86815fcccce5282d0ce049317" exitCode=0 Dec 16 15:08:22 crc kubenswrapper[4775]: I1216 15:08:22.901652 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" event={"ID":"739f7090-9a46-4ae3-a85b-045a2b1e197d","Type":"ContainerDied","Data":"ddde550bcf81cc171e595fb944197135799b73f86815fcccce5282d0ce049317"} Dec 16 15:08:22 crc kubenswrapper[4775]: I1216 15:08:22.905544 4775 generic.go:334] "Generic (PLEG): container finished" podID="de6ec352-6085-4778-a301-012d5e7d4a6c" containerID="06c6f91bf35fe63f2fa2e28c0b79e70a11c3fd6c539dfdced811dd1727364cdc" exitCode=0 Dec 16 15:08:22 crc kubenswrapper[4775]: I1216 15:08:22.905581 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zssp7" event={"ID":"de6ec352-6085-4778-a301-012d5e7d4a6c","Type":"ContainerDied","Data":"06c6f91bf35fe63f2fa2e28c0b79e70a11c3fd6c539dfdced811dd1727364cdc"} Dec 16 15:08:23 crc kubenswrapper[4775]: I1216 15:08:23.914636 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zssp7" event={"ID":"de6ec352-6085-4778-a301-012d5e7d4a6c","Type":"ContainerStarted","Data":"085e76e6fe825655a7b67536087cdacf9847d3fa97285e5cb73c07ed1d48dc29"} Dec 16 15:08:23 crc kubenswrapper[4775]: I1216 15:08:23.919027 4775 generic.go:334] "Generic (PLEG): container finished" podID="739f7090-9a46-4ae3-a85b-045a2b1e197d" containerID="43b638e1435cd09cd858d302e5d207aa5adffb8fdb80f2ec89e03472a8cd0edd" exitCode=0 Dec 16 15:08:23 crc kubenswrapper[4775]: I1216 15:08:23.919094 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" 
event={"ID":"739f7090-9a46-4ae3-a85b-045a2b1e197d","Type":"ContainerDied","Data":"43b638e1435cd09cd858d302e5d207aa5adffb8fdb80f2ec89e03472a8cd0edd"} Dec 16 15:08:25 crc kubenswrapper[4775]: I1216 15:08:25.509290 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" Dec 16 15:08:25 crc kubenswrapper[4775]: I1216 15:08:25.621151 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7ndv\" (UniqueName: \"kubernetes.io/projected/739f7090-9a46-4ae3-a85b-045a2b1e197d-kube-api-access-p7ndv\") pod \"739f7090-9a46-4ae3-a85b-045a2b1e197d\" (UID: \"739f7090-9a46-4ae3-a85b-045a2b1e197d\") " Dec 16 15:08:25 crc kubenswrapper[4775]: I1216 15:08:25.621441 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/739f7090-9a46-4ae3-a85b-045a2b1e197d-util\") pod \"739f7090-9a46-4ae3-a85b-045a2b1e197d\" (UID: \"739f7090-9a46-4ae3-a85b-045a2b1e197d\") " Dec 16 15:08:25 crc kubenswrapper[4775]: I1216 15:08:25.621542 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/739f7090-9a46-4ae3-a85b-045a2b1e197d-bundle\") pod \"739f7090-9a46-4ae3-a85b-045a2b1e197d\" (UID: \"739f7090-9a46-4ae3-a85b-045a2b1e197d\") " Dec 16 15:08:25 crc kubenswrapper[4775]: I1216 15:08:25.622436 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/739f7090-9a46-4ae3-a85b-045a2b1e197d-bundle" (OuterVolumeSpecName: "bundle") pod "739f7090-9a46-4ae3-a85b-045a2b1e197d" (UID: "739f7090-9a46-4ae3-a85b-045a2b1e197d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:08:25 crc kubenswrapper[4775]: I1216 15:08:25.724359 4775 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/739f7090-9a46-4ae3-a85b-045a2b1e197d-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:08:25 crc kubenswrapper[4775]: I1216 15:08:25.814458 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/739f7090-9a46-4ae3-a85b-045a2b1e197d-kube-api-access-p7ndv" (OuterVolumeSpecName: "kube-api-access-p7ndv") pod "739f7090-9a46-4ae3-a85b-045a2b1e197d" (UID: "739f7090-9a46-4ae3-a85b-045a2b1e197d"). InnerVolumeSpecName "kube-api-access-p7ndv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:08:25 crc kubenswrapper[4775]: I1216 15:08:25.826030 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7ndv\" (UniqueName: \"kubernetes.io/projected/739f7090-9a46-4ae3-a85b-045a2b1e197d-kube-api-access-p7ndv\") on node \"crc\" DevicePath \"\"" Dec 16 15:08:25 crc kubenswrapper[4775]: I1216 15:08:25.849938 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/739f7090-9a46-4ae3-a85b-045a2b1e197d-util" (OuterVolumeSpecName: "util") pod "739f7090-9a46-4ae3-a85b-045a2b1e197d" (UID: "739f7090-9a46-4ae3-a85b-045a2b1e197d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:08:25 crc kubenswrapper[4775]: I1216 15:08:25.928373 4775 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/739f7090-9a46-4ae3-a85b-045a2b1e197d-util\") on node \"crc\" DevicePath \"\"" Dec 16 15:08:25 crc kubenswrapper[4775]: I1216 15:08:25.933221 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" event={"ID":"739f7090-9a46-4ae3-a85b-045a2b1e197d","Type":"ContainerDied","Data":"41099dc1bb09d4af8199ac56cb7529970ab2fcb5f159de15f737b5ad00595873"} Dec 16 15:08:25 crc kubenswrapper[4775]: I1216 15:08:25.933267 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41099dc1bb09d4af8199ac56cb7529970ab2fcb5f159de15f737b5ad00595873" Dec 16 15:08:25 crc kubenswrapper[4775]: I1216 15:08:25.933291 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx" Dec 16 15:08:27 crc kubenswrapper[4775]: I1216 15:08:27.953470 4775 generic.go:334] "Generic (PLEG): container finished" podID="de6ec352-6085-4778-a301-012d5e7d4a6c" containerID="085e76e6fe825655a7b67536087cdacf9847d3fa97285e5cb73c07ed1d48dc29" exitCode=0 Dec 16 15:08:27 crc kubenswrapper[4775]: I1216 15:08:27.953543 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zssp7" event={"ID":"de6ec352-6085-4778-a301-012d5e7d4a6c","Type":"ContainerDied","Data":"085e76e6fe825655a7b67536087cdacf9847d3fa97285e5cb73c07ed1d48dc29"} Dec 16 15:08:28 crc kubenswrapper[4775]: I1216 15:08:28.794346 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-kkdbg"] Dec 16 15:08:28 crc kubenswrapper[4775]: E1216 15:08:28.794915 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="739f7090-9a46-4ae3-a85b-045a2b1e197d" containerName="extract" Dec 16 15:08:28 crc kubenswrapper[4775]: I1216 15:08:28.794931 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="739f7090-9a46-4ae3-a85b-045a2b1e197d" containerName="extract" Dec 16 15:08:28 crc kubenswrapper[4775]: E1216 15:08:28.794952 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739f7090-9a46-4ae3-a85b-045a2b1e197d" containerName="pull" Dec 16 15:08:28 crc kubenswrapper[4775]: I1216 15:08:28.794960 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="739f7090-9a46-4ae3-a85b-045a2b1e197d" containerName="pull" Dec 16 15:08:28 crc kubenswrapper[4775]: E1216 15:08:28.794970 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739f7090-9a46-4ae3-a85b-045a2b1e197d" containerName="util" Dec 16 15:08:28 crc kubenswrapper[4775]: I1216 15:08:28.794978 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="739f7090-9a46-4ae3-a85b-045a2b1e197d" containerName="util" Dec 16 15:08:28 crc kubenswrapper[4775]: I1216 15:08:28.795094 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="739f7090-9a46-4ae3-a85b-045a2b1e197d" containerName="extract" Dec 16 15:08:28 crc kubenswrapper[4775]: I1216 15:08:28.795574 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-kkdbg" Dec 16 15:08:28 crc kubenswrapper[4775]: I1216 15:08:28.797254 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 16 15:08:28 crc kubenswrapper[4775]: I1216 15:08:28.797669 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-zjkkc" Dec 16 15:08:28 crc kubenswrapper[4775]: I1216 15:08:28.798579 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 16 15:08:28 crc kubenswrapper[4775]: I1216 15:08:28.805976 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-kkdbg"] Dec 16 15:08:28 crc kubenswrapper[4775]: I1216 15:08:28.870519 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd9bj\" (UniqueName: \"kubernetes.io/projected/0318b125-3608-48d1-b19f-8fcad1785fa8-kube-api-access-xd9bj\") pod \"nmstate-operator-6769fb99d-kkdbg\" (UID: \"0318b125-3608-48d1-b19f-8fcad1785fa8\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-kkdbg" Dec 16 15:08:28 crc kubenswrapper[4775]: I1216 15:08:28.965361 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zssp7" event={"ID":"de6ec352-6085-4778-a301-012d5e7d4a6c","Type":"ContainerStarted","Data":"eae441254725300f018c8ac4e055688b09fad64835b82895c5df0bceffc2876b"} Dec 16 15:08:28 crc kubenswrapper[4775]: I1216 15:08:28.971417 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd9bj\" (UniqueName: \"kubernetes.io/projected/0318b125-3608-48d1-b19f-8fcad1785fa8-kube-api-access-xd9bj\") pod \"nmstate-operator-6769fb99d-kkdbg\" (UID: \"0318b125-3608-48d1-b19f-8fcad1785fa8\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-kkdbg" Dec 16 15:08:28 crc 
kubenswrapper[4775]: I1216 15:08:28.988742 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zssp7" podStartSLOduration=2.455239726 podStartE2EDuration="7.988720697s" podCreationTimestamp="2025-12-16 15:08:21 +0000 UTC" firstStartedPulling="2025-12-16 15:08:22.907072939 +0000 UTC m=+827.858151862" lastFinishedPulling="2025-12-16 15:08:28.44055391 +0000 UTC m=+833.391632833" observedRunningTime="2025-12-16 15:08:28.98184796 +0000 UTC m=+833.932926893" watchObservedRunningTime="2025-12-16 15:08:28.988720697 +0000 UTC m=+833.939799620" Dec 16 15:08:28 crc kubenswrapper[4775]: I1216 15:08:28.997816 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd9bj\" (UniqueName: \"kubernetes.io/projected/0318b125-3608-48d1-b19f-8fcad1785fa8-kube-api-access-xd9bj\") pod \"nmstate-operator-6769fb99d-kkdbg\" (UID: \"0318b125-3608-48d1-b19f-8fcad1785fa8\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-kkdbg" Dec 16 15:08:29 crc kubenswrapper[4775]: I1216 15:08:29.132246 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-kkdbg" Dec 16 15:08:29 crc kubenswrapper[4775]: I1216 15:08:29.578166 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-kkdbg"] Dec 16 15:08:29 crc kubenswrapper[4775]: W1216 15:08:29.595416 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0318b125_3608_48d1_b19f_8fcad1785fa8.slice/crio-7b7152bf0a290d65c9495f1a312cf5594433dc4e5519e81f4c8c9a9dd909219c WatchSource:0}: Error finding container 7b7152bf0a290d65c9495f1a312cf5594433dc4e5519e81f4c8c9a9dd909219c: Status 404 returned error can't find the container with id 7b7152bf0a290d65c9495f1a312cf5594433dc4e5519e81f4c8c9a9dd909219c Dec 16 15:08:29 crc kubenswrapper[4775]: I1216 15:08:29.971592 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-kkdbg" event={"ID":"0318b125-3608-48d1-b19f-8fcad1785fa8","Type":"ContainerStarted","Data":"7b7152bf0a290d65c9495f1a312cf5594433dc4e5519e81f4c8c9a9dd909219c"} Dec 16 15:08:31 crc kubenswrapper[4775]: I1216 15:08:31.425716 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zssp7" Dec 16 15:08:31 crc kubenswrapper[4775]: I1216 15:08:31.425805 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zssp7" Dec 16 15:08:32 crc kubenswrapper[4775]: I1216 15:08:32.460344 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zssp7" podUID="de6ec352-6085-4778-a301-012d5e7d4a6c" containerName="registry-server" probeResult="failure" output=< Dec 16 15:08:32 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Dec 16 15:08:32 crc kubenswrapper[4775]: > Dec 16 15:08:32 crc kubenswrapper[4775]: I1216 15:08:32.869474 4775 
patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:08:32 crc kubenswrapper[4775]: I1216 15:08:32.869562 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:08:32 crc kubenswrapper[4775]: I1216 15:08:32.869609 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 15:08:32 crc kubenswrapper[4775]: I1216 15:08:32.870279 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"790666d10a8413c7b1bed65625e744b82eacfed0c75d107b7bd78a845e4df70e"} pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:08:32 crc kubenswrapper[4775]: I1216 15:08:32.870342 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" containerID="cri-o://790666d10a8413c7b1bed65625e744b82eacfed0c75d107b7bd78a845e4df70e" gracePeriod=600 Dec 16 15:08:33 crc kubenswrapper[4775]: I1216 15:08:33.014933 4775 generic.go:334] "Generic (PLEG): container finished" podID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerID="790666d10a8413c7b1bed65625e744b82eacfed0c75d107b7bd78a845e4df70e" exitCode=0 Dec 16 15:08:33 crc 
kubenswrapper[4775]: I1216 15:08:33.016299 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerDied","Data":"790666d10a8413c7b1bed65625e744b82eacfed0c75d107b7bd78a845e4df70e"} Dec 16 15:08:33 crc kubenswrapper[4775]: I1216 15:08:33.018132 4775 scope.go:117] "RemoveContainer" containerID="3726a17a41d21de0c1144f1afb1120defbbe2b018d4ec48bc1ed4d607865dfc9" Dec 16 15:08:34 crc kubenswrapper[4775]: I1216 15:08:34.022492 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"e2fab779748b41d2d6bca28ee35caff1c948d4988b65a4308383bcd22a0a32a5"} Dec 16 15:08:35 crc kubenswrapper[4775]: I1216 15:08:35.030216 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-kkdbg" event={"ID":"0318b125-3608-48d1-b19f-8fcad1785fa8","Type":"ContainerStarted","Data":"6de878b41b33ec3b82893e0bec8952376117faae631bf34a2cb51c383acb64b3"} Dec 16 15:08:35 crc kubenswrapper[4775]: I1216 15:08:35.052795 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-6769fb99d-kkdbg" podStartSLOduration=2.264706286 podStartE2EDuration="7.0527727s" podCreationTimestamp="2025-12-16 15:08:28 +0000 UTC" firstStartedPulling="2025-12-16 15:08:29.597307182 +0000 UTC m=+834.548386105" lastFinishedPulling="2025-12-16 15:08:34.385373596 +0000 UTC m=+839.336452519" observedRunningTime="2025-12-16 15:08:35.050877029 +0000 UTC m=+840.001955992" watchObservedRunningTime="2025-12-16 15:08:35.0527727 +0000 UTC m=+840.003851623" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.135981 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-7hgfg"] Dec 16 15:08:39 crc 
kubenswrapper[4775]: I1216 15:08:39.137317 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-7hgfg" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.139245 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-ftm6r" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.144244 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdfbk\" (UniqueName: \"kubernetes.io/projected/6384fd2d-45e1-421e-920f-5555dc0f8a10-kube-api-access-gdfbk\") pod \"nmstate-metrics-7f7f7578db-7hgfg\" (UID: \"6384fd2d-45e1-421e-920f-5555dc0f8a10\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-7hgfg" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.146021 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-4p9vw"] Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.147219 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4p9vw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.149511 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.150120 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-7hgfg"] Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.167389 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-sc4rw"] Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.168278 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-sc4rw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.173267 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-4p9vw"] Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.245741 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdfbk\" (UniqueName: \"kubernetes.io/projected/6384fd2d-45e1-421e-920f-5555dc0f8a10-kube-api-access-gdfbk\") pod \"nmstate-metrics-7f7f7578db-7hgfg\" (UID: \"6384fd2d-45e1-421e-920f-5555dc0f8a10\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-7hgfg" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.270475 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdfbk\" (UniqueName: \"kubernetes.io/projected/6384fd2d-45e1-421e-920f-5555dc0f8a10-kube-api-access-gdfbk\") pod \"nmstate-metrics-7f7f7578db-7hgfg\" (UID: \"6384fd2d-45e1-421e-920f-5555dc0f8a10\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-7hgfg" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.282276 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-6842d"] Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.283019 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-6842d" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.284533 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.284697 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.284942 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-2jrmp" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.345025 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-6842d"] Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.346681 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5c66735d-0eb0-46a8-b2db-f65158873132-dbus-socket\") pod \"nmstate-handler-sc4rw\" (UID: \"5c66735d-0eb0-46a8-b2db-f65158873132\") " pod="openshift-nmstate/nmstate-handler-sc4rw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.346714 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5c66735d-0eb0-46a8-b2db-f65158873132-ovs-socket\") pod \"nmstate-handler-sc4rw\" (UID: \"5c66735d-0eb0-46a8-b2db-f65158873132\") " pod="openshift-nmstate/nmstate-handler-sc4rw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.346738 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/52cbae70-fde7-47d8-a118-799f6fb64f2b-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-4p9vw\" (UID: \"52cbae70-fde7-47d8-a118-799f6fb64f2b\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4p9vw" Dec 16 
15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.346756 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcsw8\" (UniqueName: \"kubernetes.io/projected/5c66735d-0eb0-46a8-b2db-f65158873132-kube-api-access-dcsw8\") pod \"nmstate-handler-sc4rw\" (UID: \"5c66735d-0eb0-46a8-b2db-f65158873132\") " pod="openshift-nmstate/nmstate-handler-sc4rw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.346806 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65zgp\" (UniqueName: \"kubernetes.io/projected/52cbae70-fde7-47d8-a118-799f6fb64f2b-kube-api-access-65zgp\") pod \"nmstate-webhook-f8fb84555-4p9vw\" (UID: \"52cbae70-fde7-47d8-a118-799f6fb64f2b\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4p9vw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.346831 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5c66735d-0eb0-46a8-b2db-f65158873132-nmstate-lock\") pod \"nmstate-handler-sc4rw\" (UID: \"5c66735d-0eb0-46a8-b2db-f65158873132\") " pod="openshift-nmstate/nmstate-handler-sc4rw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.448711 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65zgp\" (UniqueName: \"kubernetes.io/projected/52cbae70-fde7-47d8-a118-799f6fb64f2b-kube-api-access-65zgp\") pod \"nmstate-webhook-f8fb84555-4p9vw\" (UID: \"52cbae70-fde7-47d8-a118-799f6fb64f2b\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4p9vw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.448754 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5c66735d-0eb0-46a8-b2db-f65158873132-nmstate-lock\") pod \"nmstate-handler-sc4rw\" (UID: \"5c66735d-0eb0-46a8-b2db-f65158873132\") " 
pod="openshift-nmstate/nmstate-handler-sc4rw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.448788 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqdqt\" (UniqueName: \"kubernetes.io/projected/a471fecb-d3ef-427f-a02c-30a00b513bae-kube-api-access-sqdqt\") pod \"nmstate-console-plugin-6ff7998486-6842d\" (UID: \"a471fecb-d3ef-427f-a02c-30a00b513bae\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-6842d" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.448837 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5c66735d-0eb0-46a8-b2db-f65158873132-nmstate-lock\") pod \"nmstate-handler-sc4rw\" (UID: \"5c66735d-0eb0-46a8-b2db-f65158873132\") " pod="openshift-nmstate/nmstate-handler-sc4rw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.448953 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5c66735d-0eb0-46a8-b2db-f65158873132-dbus-socket\") pod \"nmstate-handler-sc4rw\" (UID: \"5c66735d-0eb0-46a8-b2db-f65158873132\") " pod="openshift-nmstate/nmstate-handler-sc4rw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.449001 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5c66735d-0eb0-46a8-b2db-f65158873132-ovs-socket\") pod \"nmstate-handler-sc4rw\" (UID: \"5c66735d-0eb0-46a8-b2db-f65158873132\") " pod="openshift-nmstate/nmstate-handler-sc4rw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.449042 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/52cbae70-fde7-47d8-a118-799f6fb64f2b-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-4p9vw\" (UID: \"52cbae70-fde7-47d8-a118-799f6fb64f2b\") " 
pod="openshift-nmstate/nmstate-webhook-f8fb84555-4p9vw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.449064 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcsw8\" (UniqueName: \"kubernetes.io/projected/5c66735d-0eb0-46a8-b2db-f65158873132-kube-api-access-dcsw8\") pod \"nmstate-handler-sc4rw\" (UID: \"5c66735d-0eb0-46a8-b2db-f65158873132\") " pod="openshift-nmstate/nmstate-handler-sc4rw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.449097 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a471fecb-d3ef-427f-a02c-30a00b513bae-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-6842d\" (UID: \"a471fecb-d3ef-427f-a02c-30a00b513bae\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-6842d" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.449128 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a471fecb-d3ef-427f-a02c-30a00b513bae-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-6842d\" (UID: \"a471fecb-d3ef-427f-a02c-30a00b513bae\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-6842d" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.449141 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5c66735d-0eb0-46a8-b2db-f65158873132-ovs-socket\") pod \"nmstate-handler-sc4rw\" (UID: \"5c66735d-0eb0-46a8-b2db-f65158873132\") " pod="openshift-nmstate/nmstate-handler-sc4rw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.449731 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5c66735d-0eb0-46a8-b2db-f65158873132-dbus-socket\") pod \"nmstate-handler-sc4rw\" (UID: 
\"5c66735d-0eb0-46a8-b2db-f65158873132\") " pod="openshift-nmstate/nmstate-handler-sc4rw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.465238 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-7hgfg" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.465772 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/52cbae70-fde7-47d8-a118-799f6fb64f2b-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-4p9vw\" (UID: \"52cbae70-fde7-47d8-a118-799f6fb64f2b\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4p9vw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.475517 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcsw8\" (UniqueName: \"kubernetes.io/projected/5c66735d-0eb0-46a8-b2db-f65158873132-kube-api-access-dcsw8\") pod \"nmstate-handler-sc4rw\" (UID: \"5c66735d-0eb0-46a8-b2db-f65158873132\") " pod="openshift-nmstate/nmstate-handler-sc4rw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.475517 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65zgp\" (UniqueName: \"kubernetes.io/projected/52cbae70-fde7-47d8-a118-799f6fb64f2b-kube-api-access-65zgp\") pod \"nmstate-webhook-f8fb84555-4p9vw\" (UID: \"52cbae70-fde7-47d8-a118-799f6fb64f2b\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4p9vw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.479622 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fc64c856b-w9btt"] Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.502710 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fc64c856b-w9btt"] Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.502821 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.549491 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74638f5e-192a-47f6-8fe2-e638637185e1-oauth-serving-cert\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.549776 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74638f5e-192a-47f6-8fe2-e638637185e1-console-serving-cert\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.549922 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74638f5e-192a-47f6-8fe2-e638637185e1-trusted-ca-bundle\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.550048 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74638f5e-192a-47f6-8fe2-e638637185e1-console-config\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.550180 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqdqt\" (UniqueName: \"kubernetes.io/projected/a471fecb-d3ef-427f-a02c-30a00b513bae-kube-api-access-sqdqt\") 
pod \"nmstate-console-plugin-6ff7998486-6842d\" (UID: \"a471fecb-d3ef-427f-a02c-30a00b513bae\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-6842d" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.550323 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74638f5e-192a-47f6-8fe2-e638637185e1-console-oauth-config\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.550521 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74638f5e-192a-47f6-8fe2-e638637185e1-service-ca\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.550709 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j2qv\" (UniqueName: \"kubernetes.io/projected/74638f5e-192a-47f6-8fe2-e638637185e1-kube-api-access-5j2qv\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.550878 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a471fecb-d3ef-427f-a02c-30a00b513bae-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-6842d\" (UID: \"a471fecb-d3ef-427f-a02c-30a00b513bae\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-6842d" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.551080 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/a471fecb-d3ef-427f-a02c-30a00b513bae-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-6842d\" (UID: \"a471fecb-d3ef-427f-a02c-30a00b513bae\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-6842d" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.552331 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a471fecb-d3ef-427f-a02c-30a00b513bae-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-6842d\" (UID: \"a471fecb-d3ef-427f-a02c-30a00b513bae\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-6842d" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.557810 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a471fecb-d3ef-427f-a02c-30a00b513bae-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-6842d\" (UID: \"a471fecb-d3ef-427f-a02c-30a00b513bae\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-6842d" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.572325 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqdqt\" (UniqueName: \"kubernetes.io/projected/a471fecb-d3ef-427f-a02c-30a00b513bae-kube-api-access-sqdqt\") pod \"nmstate-console-plugin-6ff7998486-6842d\" (UID: \"a471fecb-d3ef-427f-a02c-30a00b513bae\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-6842d" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.597213 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-6842d" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.653042 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74638f5e-192a-47f6-8fe2-e638637185e1-service-ca\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.653098 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j2qv\" (UniqueName: \"kubernetes.io/projected/74638f5e-192a-47f6-8fe2-e638637185e1-kube-api-access-5j2qv\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.653144 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74638f5e-192a-47f6-8fe2-e638637185e1-oauth-serving-cert\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.653176 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74638f5e-192a-47f6-8fe2-e638637185e1-console-serving-cert\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.653205 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74638f5e-192a-47f6-8fe2-e638637185e1-trusted-ca-bundle\") pod \"console-6fc64c856b-w9btt\" (UID: 
\"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.653226 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74638f5e-192a-47f6-8fe2-e638637185e1-console-config\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.653252 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74638f5e-192a-47f6-8fe2-e638637185e1-console-oauth-config\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.654377 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74638f5e-192a-47f6-8fe2-e638637185e1-oauth-serving-cert\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.654956 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74638f5e-192a-47f6-8fe2-e638637185e1-service-ca\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.656624 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74638f5e-192a-47f6-8fe2-e638637185e1-console-config\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " 
pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.659153 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74638f5e-192a-47f6-8fe2-e638637185e1-console-serving-cert\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.659714 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74638f5e-192a-47f6-8fe2-e638637185e1-trusted-ca-bundle\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.661272 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74638f5e-192a-47f6-8fe2-e638637185e1-console-oauth-config\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.669661 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j2qv\" (UniqueName: \"kubernetes.io/projected/74638f5e-192a-47f6-8fe2-e638637185e1-kube-api-access-5j2qv\") pod \"console-6fc64c856b-w9btt\" (UID: \"74638f5e-192a-47f6-8fe2-e638637185e1\") " pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.814637 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-sc4rw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.816045 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4p9vw" Dec 16 15:08:39 crc kubenswrapper[4775]: I1216 15:08:39.864514 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:40 crc kubenswrapper[4775]: W1216 15:08:40.004664 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c66735d_0eb0_46a8_b2db_f65158873132.slice/crio-44d8f5332a5d7157c2337343c509c5399d18c8dfe958723691a76dfc5ed54e19 WatchSource:0}: Error finding container 44d8f5332a5d7157c2337343c509c5399d18c8dfe958723691a76dfc5ed54e19: Status 404 returned error can't find the container with id 44d8f5332a5d7157c2337343c509c5399d18c8dfe958723691a76dfc5ed54e19 Dec 16 15:08:40 crc kubenswrapper[4775]: I1216 15:08:40.098753 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sc4rw" event={"ID":"5c66735d-0eb0-46a8-b2db-f65158873132","Type":"ContainerStarted","Data":"44d8f5332a5d7157c2337343c509c5399d18c8dfe958723691a76dfc5ed54e19"} Dec 16 15:08:40 crc kubenswrapper[4775]: I1216 15:08:40.259622 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-7hgfg"] Dec 16 15:08:40 crc kubenswrapper[4775]: I1216 15:08:40.286285 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-6842d"] Dec 16 15:08:40 crc kubenswrapper[4775]: W1216 15:08:40.298864 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda471fecb_d3ef_427f_a02c_30a00b513bae.slice/crio-6805685b54115f5e7c55e5fd189fbd5b562268e748ccb7329d48e2da47d6e43b WatchSource:0}: Error finding container 6805685b54115f5e7c55e5fd189fbd5b562268e748ccb7329d48e2da47d6e43b: Status 404 returned error can't find the container with id 
6805685b54115f5e7c55e5fd189fbd5b562268e748ccb7329d48e2da47d6e43b Dec 16 15:08:40 crc kubenswrapper[4775]: I1216 15:08:40.384092 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fc64c856b-w9btt"] Dec 16 15:08:40 crc kubenswrapper[4775]: I1216 15:08:40.426293 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-4p9vw"] Dec 16 15:08:40 crc kubenswrapper[4775]: W1216 15:08:40.432507 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52cbae70_fde7_47d8_a118_799f6fb64f2b.slice/crio-7590b99b8d1273f0e065464a514d2365b4d6fcf8c1445451ad4b69a9b50af7cc WatchSource:0}: Error finding container 7590b99b8d1273f0e065464a514d2365b4d6fcf8c1445451ad4b69a9b50af7cc: Status 404 returned error can't find the container with id 7590b99b8d1273f0e065464a514d2365b4d6fcf8c1445451ad4b69a9b50af7cc Dec 16 15:08:41 crc kubenswrapper[4775]: I1216 15:08:41.107161 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-6842d" event={"ID":"a471fecb-d3ef-427f-a02c-30a00b513bae","Type":"ContainerStarted","Data":"6805685b54115f5e7c55e5fd189fbd5b562268e748ccb7329d48e2da47d6e43b"} Dec 16 15:08:41 crc kubenswrapper[4775]: I1216 15:08:41.108358 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4p9vw" event={"ID":"52cbae70-fde7-47d8-a118-799f6fb64f2b","Type":"ContainerStarted","Data":"7590b99b8d1273f0e065464a514d2365b4d6fcf8c1445451ad4b69a9b50af7cc"} Dec 16 15:08:41 crc kubenswrapper[4775]: I1216 15:08:41.109684 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fc64c856b-w9btt" event={"ID":"74638f5e-192a-47f6-8fe2-e638637185e1","Type":"ContainerStarted","Data":"716807753ae2a4dbbdb88b289069e083fa899ad510b9a69ebeb035c5fd1b13b5"} Dec 16 15:08:41 crc kubenswrapper[4775]: I1216 15:08:41.109736 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fc64c856b-w9btt" event={"ID":"74638f5e-192a-47f6-8fe2-e638637185e1","Type":"ContainerStarted","Data":"ce6442193d61d27a82a68f0769fcbadddf1fdcd50c9b770109a6eb7aa850c97f"} Dec 16 15:08:41 crc kubenswrapper[4775]: I1216 15:08:41.111273 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-7hgfg" event={"ID":"6384fd2d-45e1-421e-920f-5555dc0f8a10","Type":"ContainerStarted","Data":"e423dbe6f326a5bc8dc9a2e8fd878e47219a1363fc83bd182db67a1715d905fa"} Dec 16 15:08:41 crc kubenswrapper[4775]: I1216 15:08:41.128397 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fc64c856b-w9btt" podStartSLOduration=2.128373048 podStartE2EDuration="2.128373048s" podCreationTimestamp="2025-12-16 15:08:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:08:41.126627972 +0000 UTC m=+846.077706905" watchObservedRunningTime="2025-12-16 15:08:41.128373048 +0000 UTC m=+846.079452021" Dec 16 15:08:41 crc kubenswrapper[4775]: I1216 15:08:41.467029 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zssp7" Dec 16 15:08:41 crc kubenswrapper[4775]: I1216 15:08:41.510875 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zssp7" Dec 16 15:08:41 crc kubenswrapper[4775]: I1216 15:08:41.698670 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zssp7"] Dec 16 15:08:43 crc kubenswrapper[4775]: I1216 15:08:43.125924 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zssp7" podUID="de6ec352-6085-4778-a301-012d5e7d4a6c" containerName="registry-server" 
containerID="cri-o://eae441254725300f018c8ac4e055688b09fad64835b82895c5df0bceffc2876b" gracePeriod=2 Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.091741 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zssp7" Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.142335 4775 generic.go:334] "Generic (PLEG): container finished" podID="de6ec352-6085-4778-a301-012d5e7d4a6c" containerID="eae441254725300f018c8ac4e055688b09fad64835b82895c5df0bceffc2876b" exitCode=0 Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.142414 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zssp7" Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.142446 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zssp7" event={"ID":"de6ec352-6085-4778-a301-012d5e7d4a6c","Type":"ContainerDied","Data":"eae441254725300f018c8ac4e055688b09fad64835b82895c5df0bceffc2876b"} Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.143156 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zssp7" event={"ID":"de6ec352-6085-4778-a301-012d5e7d4a6c","Type":"ContainerDied","Data":"5aeb9015e8835c0da302dfd6141ad5c1959ec9acd9f6269e6c4e716c16dd3b6a"} Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.143184 4775 scope.go:117] "RemoveContainer" containerID="eae441254725300f018c8ac4e055688b09fad64835b82895c5df0bceffc2876b" Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.182775 4775 scope.go:117] "RemoveContainer" containerID="085e76e6fe825655a7b67536087cdacf9847d3fa97285e5cb73c07ed1d48dc29" Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.224865 4775 scope.go:117] "RemoveContainer" containerID="06c6f91bf35fe63f2fa2e28c0b79e70a11c3fd6c539dfdced811dd1727364cdc" Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.241622 4775 
scope.go:117] "RemoveContainer" containerID="eae441254725300f018c8ac4e055688b09fad64835b82895c5df0bceffc2876b" Dec 16 15:08:44 crc kubenswrapper[4775]: E1216 15:08:44.242016 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eae441254725300f018c8ac4e055688b09fad64835b82895c5df0bceffc2876b\": container with ID starting with eae441254725300f018c8ac4e055688b09fad64835b82895c5df0bceffc2876b not found: ID does not exist" containerID="eae441254725300f018c8ac4e055688b09fad64835b82895c5df0bceffc2876b" Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.242070 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eae441254725300f018c8ac4e055688b09fad64835b82895c5df0bceffc2876b"} err="failed to get container status \"eae441254725300f018c8ac4e055688b09fad64835b82895c5df0bceffc2876b\": rpc error: code = NotFound desc = could not find container \"eae441254725300f018c8ac4e055688b09fad64835b82895c5df0bceffc2876b\": container with ID starting with eae441254725300f018c8ac4e055688b09fad64835b82895c5df0bceffc2876b not found: ID does not exist" Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.242106 4775 scope.go:117] "RemoveContainer" containerID="085e76e6fe825655a7b67536087cdacf9847d3fa97285e5cb73c07ed1d48dc29" Dec 16 15:08:44 crc kubenswrapper[4775]: E1216 15:08:44.242562 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"085e76e6fe825655a7b67536087cdacf9847d3fa97285e5cb73c07ed1d48dc29\": container with ID starting with 085e76e6fe825655a7b67536087cdacf9847d3fa97285e5cb73c07ed1d48dc29 not found: ID does not exist" containerID="085e76e6fe825655a7b67536087cdacf9847d3fa97285e5cb73c07ed1d48dc29" Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.242591 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"085e76e6fe825655a7b67536087cdacf9847d3fa97285e5cb73c07ed1d48dc29"} err="failed to get container status \"085e76e6fe825655a7b67536087cdacf9847d3fa97285e5cb73c07ed1d48dc29\": rpc error: code = NotFound desc = could not find container \"085e76e6fe825655a7b67536087cdacf9847d3fa97285e5cb73c07ed1d48dc29\": container with ID starting with 085e76e6fe825655a7b67536087cdacf9847d3fa97285e5cb73c07ed1d48dc29 not found: ID does not exist" Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.242607 4775 scope.go:117] "RemoveContainer" containerID="06c6f91bf35fe63f2fa2e28c0b79e70a11c3fd6c539dfdced811dd1727364cdc" Dec 16 15:08:44 crc kubenswrapper[4775]: E1216 15:08:44.242860 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06c6f91bf35fe63f2fa2e28c0b79e70a11c3fd6c539dfdced811dd1727364cdc\": container with ID starting with 06c6f91bf35fe63f2fa2e28c0b79e70a11c3fd6c539dfdced811dd1727364cdc not found: ID does not exist" containerID="06c6f91bf35fe63f2fa2e28c0b79e70a11c3fd6c539dfdced811dd1727364cdc" Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.242940 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06c6f91bf35fe63f2fa2e28c0b79e70a11c3fd6c539dfdced811dd1727364cdc"} err="failed to get container status \"06c6f91bf35fe63f2fa2e28c0b79e70a11c3fd6c539dfdced811dd1727364cdc\": rpc error: code = NotFound desc = could not find container \"06c6f91bf35fe63f2fa2e28c0b79e70a11c3fd6c539dfdced811dd1727364cdc\": container with ID starting with 06c6f91bf35fe63f2fa2e28c0b79e70a11c3fd6c539dfdced811dd1727364cdc not found: ID does not exist" Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.247219 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4l9h\" (UniqueName: \"kubernetes.io/projected/de6ec352-6085-4778-a301-012d5e7d4a6c-kube-api-access-f4l9h\") pod 
\"de6ec352-6085-4778-a301-012d5e7d4a6c\" (UID: \"de6ec352-6085-4778-a301-012d5e7d4a6c\") " Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.247325 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6ec352-6085-4778-a301-012d5e7d4a6c-catalog-content\") pod \"de6ec352-6085-4778-a301-012d5e7d4a6c\" (UID: \"de6ec352-6085-4778-a301-012d5e7d4a6c\") " Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.247393 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6ec352-6085-4778-a301-012d5e7d4a6c-utilities\") pod \"de6ec352-6085-4778-a301-012d5e7d4a6c\" (UID: \"de6ec352-6085-4778-a301-012d5e7d4a6c\") " Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.248494 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de6ec352-6085-4778-a301-012d5e7d4a6c-utilities" (OuterVolumeSpecName: "utilities") pod "de6ec352-6085-4778-a301-012d5e7d4a6c" (UID: "de6ec352-6085-4778-a301-012d5e7d4a6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.259374 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6ec352-6085-4778-a301-012d5e7d4a6c-kube-api-access-f4l9h" (OuterVolumeSpecName: "kube-api-access-f4l9h") pod "de6ec352-6085-4778-a301-012d5e7d4a6c" (UID: "de6ec352-6085-4778-a301-012d5e7d4a6c"). InnerVolumeSpecName "kube-api-access-f4l9h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.348984 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6ec352-6085-4778-a301-012d5e7d4a6c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.349017 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4l9h\" (UniqueName: \"kubernetes.io/projected/de6ec352-6085-4778-a301-012d5e7d4a6c-kube-api-access-f4l9h\") on node \"crc\" DevicePath \"\"" Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.379744 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de6ec352-6085-4778-a301-012d5e7d4a6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de6ec352-6085-4778-a301-012d5e7d4a6c" (UID: "de6ec352-6085-4778-a301-012d5e7d4a6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.450177 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6ec352-6085-4778-a301-012d5e7d4a6c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.476231 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zssp7"] Dec 16 15:08:44 crc kubenswrapper[4775]: I1216 15:08:44.480590 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zssp7"] Dec 16 15:08:45 crc kubenswrapper[4775]: I1216 15:08:45.150846 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4p9vw" event={"ID":"52cbae70-fde7-47d8-a118-799f6fb64f2b","Type":"ContainerStarted","Data":"69617a763894f6bdee9d0191ff56959484a7d54db4ad1e1f9ba103c87d7b7a60"} Dec 16 15:08:45 crc 
kubenswrapper[4775]: I1216 15:08:45.151036 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4p9vw" Dec 16 15:08:45 crc kubenswrapper[4775]: I1216 15:08:45.152342 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-7hgfg" event={"ID":"6384fd2d-45e1-421e-920f-5555dc0f8a10","Type":"ContainerStarted","Data":"7c339295ad386f75dca62eb7e27a409b372a0c433301f162bb85c65053735306"} Dec 16 15:08:45 crc kubenswrapper[4775]: I1216 15:08:45.157083 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-6842d" event={"ID":"a471fecb-d3ef-427f-a02c-30a00b513bae","Type":"ContainerStarted","Data":"81bf9ec1c32483d7722d24eda5dc9177b32799d7eb6d208e57e1fbc54dd38681"} Dec 16 15:08:45 crc kubenswrapper[4775]: I1216 15:08:45.158829 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sc4rw" event={"ID":"5c66735d-0eb0-46a8-b2db-f65158873132","Type":"ContainerStarted","Data":"0a09a500f6b163c56ed09393ec492bb58628964f4f0a464ef6793082ad0d9611"} Dec 16 15:08:45 crc kubenswrapper[4775]: I1216 15:08:45.159009 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-sc4rw" Dec 16 15:08:45 crc kubenswrapper[4775]: I1216 15:08:45.171281 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4p9vw" podStartSLOduration=2.480376072 podStartE2EDuration="6.171261662s" podCreationTimestamp="2025-12-16 15:08:39 +0000 UTC" firstStartedPulling="2025-12-16 15:08:40.435105712 +0000 UTC m=+845.386184635" lastFinishedPulling="2025-12-16 15:08:44.125991302 +0000 UTC m=+849.077070225" observedRunningTime="2025-12-16 15:08:45.166395416 +0000 UTC m=+850.117474359" watchObservedRunningTime="2025-12-16 15:08:45.171261662 +0000 UTC m=+850.122340575" Dec 16 15:08:45 crc kubenswrapper[4775]: 
I1216 15:08:45.189149 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-6842d" podStartSLOduration=2.367791477 podStartE2EDuration="6.189129115s" podCreationTimestamp="2025-12-16 15:08:39 +0000 UTC" firstStartedPulling="2025-12-16 15:08:40.302257978 +0000 UTC m=+845.253336901" lastFinishedPulling="2025-12-16 15:08:44.123595606 +0000 UTC m=+849.074674539" observedRunningTime="2025-12-16 15:08:45.186361076 +0000 UTC m=+850.137440019" watchObservedRunningTime="2025-12-16 15:08:45.189129115 +0000 UTC m=+850.140208038" Dec 16 15:08:45 crc kubenswrapper[4775]: I1216 15:08:45.203105 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-sc4rw" podStartSLOduration=2.087347034 podStartE2EDuration="6.203084442s" podCreationTimestamp="2025-12-16 15:08:39 +0000 UTC" firstStartedPulling="2025-12-16 15:08:40.007075672 +0000 UTC m=+844.958154595" lastFinishedPulling="2025-12-16 15:08:44.12281308 +0000 UTC m=+849.073892003" observedRunningTime="2025-12-16 15:08:45.201914984 +0000 UTC m=+850.152993937" watchObservedRunningTime="2025-12-16 15:08:45.203084442 +0000 UTC m=+850.154163365" Dec 16 15:08:45 crc kubenswrapper[4775]: I1216 15:08:45.352217 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de6ec352-6085-4778-a301-012d5e7d4a6c" path="/var/lib/kubelet/pods/de6ec352-6085-4778-a301-012d5e7d4a6c/volumes" Dec 16 15:08:47 crc kubenswrapper[4775]: I1216 15:08:47.186106 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-7hgfg" event={"ID":"6384fd2d-45e1-421e-920f-5555dc0f8a10","Type":"ContainerStarted","Data":"c24711ea5dac2a5bff77f0e43fe9f1da70fde3cc7b76cd05adfed0a279e81fb3"} Dec 16 15:08:47 crc kubenswrapper[4775]: I1216 15:08:47.202021 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-7hgfg" 
podStartSLOduration=1.7531346989999999 podStartE2EDuration="8.202002177s" podCreationTimestamp="2025-12-16 15:08:39 +0000 UTC" firstStartedPulling="2025-12-16 15:08:40.293800757 +0000 UTC m=+845.244879680" lastFinishedPulling="2025-12-16 15:08:46.742668235 +0000 UTC m=+851.693747158" observedRunningTime="2025-12-16 15:08:47.199782486 +0000 UTC m=+852.150861429" watchObservedRunningTime="2025-12-16 15:08:47.202002177 +0000 UTC m=+852.153081100" Dec 16 15:08:49 crc kubenswrapper[4775]: I1216 15:08:49.852652 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-sc4rw" Dec 16 15:08:49 crc kubenswrapper[4775]: I1216 15:08:49.865273 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:49 crc kubenswrapper[4775]: I1216 15:08:49.866482 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:49 crc kubenswrapper[4775]: I1216 15:08:49.874051 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:50 crc kubenswrapper[4775]: I1216 15:08:50.210537 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6fc64c856b-w9btt" Dec 16 15:08:50 crc kubenswrapper[4775]: I1216 15:08:50.259435 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-fc2jr"] Dec 16 15:08:59 crc kubenswrapper[4775]: I1216 15:08:59.827917 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4p9vw" Dec 16 15:09:13 crc kubenswrapper[4775]: I1216 15:09:13.658295 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66"] Dec 16 15:09:13 crc kubenswrapper[4775]: E1216 
15:09:13.659474 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6ec352-6085-4778-a301-012d5e7d4a6c" containerName="extract-utilities" Dec 16 15:09:13 crc kubenswrapper[4775]: I1216 15:09:13.659496 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6ec352-6085-4778-a301-012d5e7d4a6c" containerName="extract-utilities" Dec 16 15:09:13 crc kubenswrapper[4775]: E1216 15:09:13.659510 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6ec352-6085-4778-a301-012d5e7d4a6c" containerName="extract-content" Dec 16 15:09:13 crc kubenswrapper[4775]: I1216 15:09:13.659521 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6ec352-6085-4778-a301-012d5e7d4a6c" containerName="extract-content" Dec 16 15:09:13 crc kubenswrapper[4775]: E1216 15:09:13.659537 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6ec352-6085-4778-a301-012d5e7d4a6c" containerName="registry-server" Dec 16 15:09:13 crc kubenswrapper[4775]: I1216 15:09:13.659547 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6ec352-6085-4778-a301-012d5e7d4a6c" containerName="registry-server" Dec 16 15:09:13 crc kubenswrapper[4775]: I1216 15:09:13.659721 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6ec352-6085-4778-a301-012d5e7d4a6c" containerName="registry-server" Dec 16 15:09:13 crc kubenswrapper[4775]: I1216 15:09:13.661022 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" Dec 16 15:09:13 crc kubenswrapper[4775]: I1216 15:09:13.668876 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66"] Dec 16 15:09:13 crc kubenswrapper[4775]: I1216 15:09:13.671944 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 16 15:09:13 crc kubenswrapper[4775]: I1216 15:09:13.823811 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6w4c\" (UniqueName: \"kubernetes.io/projected/50c7dffe-e977-448f-bcdd-7a68df1cefca-kube-api-access-n6w4c\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66\" (UID: \"50c7dffe-e977-448f-bcdd-7a68df1cefca\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" Dec 16 15:09:13 crc kubenswrapper[4775]: I1216 15:09:13.823935 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50c7dffe-e977-448f-bcdd-7a68df1cefca-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66\" (UID: \"50c7dffe-e977-448f-bcdd-7a68df1cefca\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" Dec 16 15:09:13 crc kubenswrapper[4775]: I1216 15:09:13.823978 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50c7dffe-e977-448f-bcdd-7a68df1cefca-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66\" (UID: \"50c7dffe-e977-448f-bcdd-7a68df1cefca\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" Dec 16 15:09:13 crc kubenswrapper[4775]: 
I1216 15:09:13.925393 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50c7dffe-e977-448f-bcdd-7a68df1cefca-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66\" (UID: \"50c7dffe-e977-448f-bcdd-7a68df1cefca\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" Dec 16 15:09:13 crc kubenswrapper[4775]: I1216 15:09:13.925469 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50c7dffe-e977-448f-bcdd-7a68df1cefca-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66\" (UID: \"50c7dffe-e977-448f-bcdd-7a68df1cefca\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" Dec 16 15:09:13 crc kubenswrapper[4775]: I1216 15:09:13.925529 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6w4c\" (UniqueName: \"kubernetes.io/projected/50c7dffe-e977-448f-bcdd-7a68df1cefca-kube-api-access-n6w4c\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66\" (UID: \"50c7dffe-e977-448f-bcdd-7a68df1cefca\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" Dec 16 15:09:13 crc kubenswrapper[4775]: I1216 15:09:13.926350 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50c7dffe-e977-448f-bcdd-7a68df1cefca-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66\" (UID: \"50c7dffe-e977-448f-bcdd-7a68df1cefca\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" Dec 16 15:09:13 crc kubenswrapper[4775]: I1216 15:09:13.926520 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/50c7dffe-e977-448f-bcdd-7a68df1cefca-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66\" (UID: \"50c7dffe-e977-448f-bcdd-7a68df1cefca\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" Dec 16 15:09:13 crc kubenswrapper[4775]: I1216 15:09:13.953006 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6w4c\" (UniqueName: \"kubernetes.io/projected/50c7dffe-e977-448f-bcdd-7a68df1cefca-kube-api-access-n6w4c\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66\" (UID: \"50c7dffe-e977-448f-bcdd-7a68df1cefca\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" Dec 16 15:09:13 crc kubenswrapper[4775]: I1216 15:09:13.987778 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" Dec 16 15:09:14 crc kubenswrapper[4775]: I1216 15:09:14.236284 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66"] Dec 16 15:09:14 crc kubenswrapper[4775]: I1216 15:09:14.352251 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" event={"ID":"50c7dffe-e977-448f-bcdd-7a68df1cefca","Type":"ContainerStarted","Data":"f94a3e47dcab56f11272b5d0091bfed2a75b67ec734e4f24af63b7bfe19d5d34"} Dec 16 15:09:15 crc kubenswrapper[4775]: I1216 15:09:15.306822 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-fc2jr" podUID="c21af7b0-6f27-43de-8c44-6e6519262019" containerName="console" containerID="cri-o://fb988c5ad1b7d22b38dc7e1aab4ca8cf2e49435fdfaff32984e0beb3e9fea217" gracePeriod=15 Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.212223 4775 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-fc2jr_c21af7b0-6f27-43de-8c44-6e6519262019/console/0.log" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.212654 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.298669 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-trusted-ca-bundle\") pod \"c21af7b0-6f27-43de-8c44-6e6519262019\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.298946 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-console-config\") pod \"c21af7b0-6f27-43de-8c44-6e6519262019\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.299032 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c21af7b0-6f27-43de-8c44-6e6519262019-console-oauth-config\") pod \"c21af7b0-6f27-43de-8c44-6e6519262019\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.299139 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-oauth-serving-cert\") pod \"c21af7b0-6f27-43de-8c44-6e6519262019\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.299227 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsgxd\" (UniqueName: 
\"kubernetes.io/projected/c21af7b0-6f27-43de-8c44-6e6519262019-kube-api-access-tsgxd\") pod \"c21af7b0-6f27-43de-8c44-6e6519262019\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.299388 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c21af7b0-6f27-43de-8c44-6e6519262019-console-serving-cert\") pod \"c21af7b0-6f27-43de-8c44-6e6519262019\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.299458 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-service-ca\") pod \"c21af7b0-6f27-43de-8c44-6e6519262019\" (UID: \"c21af7b0-6f27-43de-8c44-6e6519262019\") " Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.299671 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c21af7b0-6f27-43de-8c44-6e6519262019" (UID: "c21af7b0-6f27-43de-8c44-6e6519262019"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.299718 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-console-config" (OuterVolumeSpecName: "console-config") pod "c21af7b0-6f27-43de-8c44-6e6519262019" (UID: "c21af7b0-6f27-43de-8c44-6e6519262019"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.299743 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c21af7b0-6f27-43de-8c44-6e6519262019" (UID: "c21af7b0-6f27-43de-8c44-6e6519262019"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.300061 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-service-ca" (OuterVolumeSpecName: "service-ca") pod "c21af7b0-6f27-43de-8c44-6e6519262019" (UID: "c21af7b0-6f27-43de-8c44-6e6519262019"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.305402 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c21af7b0-6f27-43de-8c44-6e6519262019-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c21af7b0-6f27-43de-8c44-6e6519262019" (UID: "c21af7b0-6f27-43de-8c44-6e6519262019"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.305427 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c21af7b0-6f27-43de-8c44-6e6519262019-kube-api-access-tsgxd" (OuterVolumeSpecName: "kube-api-access-tsgxd") pod "c21af7b0-6f27-43de-8c44-6e6519262019" (UID: "c21af7b0-6f27-43de-8c44-6e6519262019"). InnerVolumeSpecName "kube-api-access-tsgxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.305956 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c21af7b0-6f27-43de-8c44-6e6519262019-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c21af7b0-6f27-43de-8c44-6e6519262019" (UID: "c21af7b0-6f27-43de-8c44-6e6519262019"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.364783 4775 generic.go:334] "Generic (PLEG): container finished" podID="50c7dffe-e977-448f-bcdd-7a68df1cefca" containerID="b581e419791418f7434ea8b986b9d50af5b20baa1bb6ee19b6fed6332fcf02c0" exitCode=0 Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.364922 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" event={"ID":"50c7dffe-e977-448f-bcdd-7a68df1cefca","Type":"ContainerDied","Data":"b581e419791418f7434ea8b986b9d50af5b20baa1bb6ee19b6fed6332fcf02c0"} Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.368127 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-fc2jr_c21af7b0-6f27-43de-8c44-6e6519262019/console/0.log" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.368165 4775 generic.go:334] "Generic (PLEG): container finished" podID="c21af7b0-6f27-43de-8c44-6e6519262019" containerID="fb988c5ad1b7d22b38dc7e1aab4ca8cf2e49435fdfaff32984e0beb3e9fea217" exitCode=2 Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.368190 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-fc2jr" event={"ID":"c21af7b0-6f27-43de-8c44-6e6519262019","Type":"ContainerDied","Data":"fb988c5ad1b7d22b38dc7e1aab4ca8cf2e49435fdfaff32984e0beb3e9fea217"} Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.368220 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-fc2jr" event={"ID":"c21af7b0-6f27-43de-8c44-6e6519262019","Type":"ContainerDied","Data":"9f1165088cb46e8c739d4ada51d43545483d6747c859bc60be90e871f9491b15"} Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.368248 4775 scope.go:117] "RemoveContainer" containerID="fb988c5ad1b7d22b38dc7e1aab4ca8cf2e49435fdfaff32984e0beb3e9fea217" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.368361 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-fc2jr" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.389587 4775 scope.go:117] "RemoveContainer" containerID="fb988c5ad1b7d22b38dc7e1aab4ca8cf2e49435fdfaff32984e0beb3e9fea217" Dec 16 15:09:16 crc kubenswrapper[4775]: E1216 15:09:16.390678 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb988c5ad1b7d22b38dc7e1aab4ca8cf2e49435fdfaff32984e0beb3e9fea217\": container with ID starting with fb988c5ad1b7d22b38dc7e1aab4ca8cf2e49435fdfaff32984e0beb3e9fea217 not found: ID does not exist" containerID="fb988c5ad1b7d22b38dc7e1aab4ca8cf2e49435fdfaff32984e0beb3e9fea217" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.390789 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb988c5ad1b7d22b38dc7e1aab4ca8cf2e49435fdfaff32984e0beb3e9fea217"} err="failed to get container status \"fb988c5ad1b7d22b38dc7e1aab4ca8cf2e49435fdfaff32984e0beb3e9fea217\": rpc error: code = NotFound desc = could not find container \"fb988c5ad1b7d22b38dc7e1aab4ca8cf2e49435fdfaff32984e0beb3e9fea217\": container with ID starting with fb988c5ad1b7d22b38dc7e1aab4ca8cf2e49435fdfaff32984e0beb3e9fea217 not found: ID does not exist" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.400907 4775 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-tsgxd\" (UniqueName: \"kubernetes.io/projected/c21af7b0-6f27-43de-8c44-6e6519262019-kube-api-access-tsgxd\") on node \"crc\" DevicePath \"\"" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.401049 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.401068 4775 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c21af7b0-6f27-43de-8c44-6e6519262019-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.401081 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.401092 4775 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-console-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.401103 4775 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c21af7b0-6f27-43de-8c44-6e6519262019-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.401115 4775 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c21af7b0-6f27-43de-8c44-6e6519262019-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:09:16 crc kubenswrapper[4775]: I1216 15:09:16.404193 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-fc2jr"] Dec 16 15:09:16 crc kubenswrapper[4775]: 
I1216 15:09:16.407978 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-fc2jr"] Dec 16 15:09:17 crc kubenswrapper[4775]: I1216 15:09:17.344880 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c21af7b0-6f27-43de-8c44-6e6519262019" path="/var/lib/kubelet/pods/c21af7b0-6f27-43de-8c44-6e6519262019/volumes" Dec 16 15:09:18 crc kubenswrapper[4775]: I1216 15:09:18.384913 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" event={"ID":"50c7dffe-e977-448f-bcdd-7a68df1cefca","Type":"ContainerStarted","Data":"874e46311b5770f6af875b347ad83df08b9f223152581b76fbe22c62d11c7cb6"} Dec 16 15:09:19 crc kubenswrapper[4775]: I1216 15:09:19.391976 4775 generic.go:334] "Generic (PLEG): container finished" podID="50c7dffe-e977-448f-bcdd-7a68df1cefca" containerID="874e46311b5770f6af875b347ad83df08b9f223152581b76fbe22c62d11c7cb6" exitCode=0 Dec 16 15:09:19 crc kubenswrapper[4775]: I1216 15:09:19.392028 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" event={"ID":"50c7dffe-e977-448f-bcdd-7a68df1cefca","Type":"ContainerDied","Data":"874e46311b5770f6af875b347ad83df08b9f223152581b76fbe22c62d11c7cb6"} Dec 16 15:09:20 crc kubenswrapper[4775]: I1216 15:09:20.402300 4775 generic.go:334] "Generic (PLEG): container finished" podID="50c7dffe-e977-448f-bcdd-7a68df1cefca" containerID="c3a1fa7eeb57ff782cb1dc2845815136a21affc35785d3e542e926ff66169b35" exitCode=0 Dec 16 15:09:20 crc kubenswrapper[4775]: I1216 15:09:20.402351 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" event={"ID":"50c7dffe-e977-448f-bcdd-7a68df1cefca","Type":"ContainerDied","Data":"c3a1fa7eeb57ff782cb1dc2845815136a21affc35785d3e542e926ff66169b35"} Dec 16 15:09:21 crc 
kubenswrapper[4775]: I1216 15:09:21.730996 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" Dec 16 15:09:21 crc kubenswrapper[4775]: I1216 15:09:21.816816 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50c7dffe-e977-448f-bcdd-7a68df1cefca-util\") pod \"50c7dffe-e977-448f-bcdd-7a68df1cefca\" (UID: \"50c7dffe-e977-448f-bcdd-7a68df1cefca\") " Dec 16 15:09:21 crc kubenswrapper[4775]: I1216 15:09:21.816879 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6w4c\" (UniqueName: \"kubernetes.io/projected/50c7dffe-e977-448f-bcdd-7a68df1cefca-kube-api-access-n6w4c\") pod \"50c7dffe-e977-448f-bcdd-7a68df1cefca\" (UID: \"50c7dffe-e977-448f-bcdd-7a68df1cefca\") " Dec 16 15:09:21 crc kubenswrapper[4775]: I1216 15:09:21.816934 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50c7dffe-e977-448f-bcdd-7a68df1cefca-bundle\") pod \"50c7dffe-e977-448f-bcdd-7a68df1cefca\" (UID: \"50c7dffe-e977-448f-bcdd-7a68df1cefca\") " Dec 16 15:09:21 crc kubenswrapper[4775]: I1216 15:09:21.817969 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50c7dffe-e977-448f-bcdd-7a68df1cefca-bundle" (OuterVolumeSpecName: "bundle") pod "50c7dffe-e977-448f-bcdd-7a68df1cefca" (UID: "50c7dffe-e977-448f-bcdd-7a68df1cefca"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:09:21 crc kubenswrapper[4775]: I1216 15:09:21.823082 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50c7dffe-e977-448f-bcdd-7a68df1cefca-kube-api-access-n6w4c" (OuterVolumeSpecName: "kube-api-access-n6w4c") pod "50c7dffe-e977-448f-bcdd-7a68df1cefca" (UID: "50c7dffe-e977-448f-bcdd-7a68df1cefca"). InnerVolumeSpecName "kube-api-access-n6w4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:09:21 crc kubenswrapper[4775]: I1216 15:09:21.827566 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50c7dffe-e977-448f-bcdd-7a68df1cefca-util" (OuterVolumeSpecName: "util") pod "50c7dffe-e977-448f-bcdd-7a68df1cefca" (UID: "50c7dffe-e977-448f-bcdd-7a68df1cefca"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:09:21 crc kubenswrapper[4775]: I1216 15:09:21.917923 4775 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50c7dffe-e977-448f-bcdd-7a68df1cefca-util\") on node \"crc\" DevicePath \"\"" Dec 16 15:09:21 crc kubenswrapper[4775]: I1216 15:09:21.917965 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6w4c\" (UniqueName: \"kubernetes.io/projected/50c7dffe-e977-448f-bcdd-7a68df1cefca-kube-api-access-n6w4c\") on node \"crc\" DevicePath \"\"" Dec 16 15:09:21 crc kubenswrapper[4775]: I1216 15:09:21.917977 4775 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50c7dffe-e977-448f-bcdd-7a68df1cefca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:09:22 crc kubenswrapper[4775]: I1216 15:09:22.419516 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" 
event={"ID":"50c7dffe-e977-448f-bcdd-7a68df1cefca","Type":"ContainerDied","Data":"f94a3e47dcab56f11272b5d0091bfed2a75b67ec734e4f24af63b7bfe19d5d34"} Dec 16 15:09:22 crc kubenswrapper[4775]: I1216 15:09:22.419577 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f94a3e47dcab56f11272b5d0091bfed2a75b67ec734e4f24af63b7bfe19d5d34" Dec 16 15:09:22 crc kubenswrapper[4775]: I1216 15:09:22.419587 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66" Dec 16 15:09:31 crc kubenswrapper[4775]: I1216 15:09:31.790672 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk"] Dec 16 15:09:31 crc kubenswrapper[4775]: E1216 15:09:31.791855 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c7dffe-e977-448f-bcdd-7a68df1cefca" containerName="pull" Dec 16 15:09:31 crc kubenswrapper[4775]: I1216 15:09:31.791874 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c7dffe-e977-448f-bcdd-7a68df1cefca" containerName="pull" Dec 16 15:09:31 crc kubenswrapper[4775]: E1216 15:09:31.791922 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c7dffe-e977-448f-bcdd-7a68df1cefca" containerName="extract" Dec 16 15:09:31 crc kubenswrapper[4775]: I1216 15:09:31.791929 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c7dffe-e977-448f-bcdd-7a68df1cefca" containerName="extract" Dec 16 15:09:31 crc kubenswrapper[4775]: E1216 15:09:31.791941 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c7dffe-e977-448f-bcdd-7a68df1cefca" containerName="util" Dec 16 15:09:31 crc kubenswrapper[4775]: I1216 15:09:31.791948 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c7dffe-e977-448f-bcdd-7a68df1cefca" containerName="util" Dec 16 15:09:31 crc kubenswrapper[4775]: E1216 15:09:31.791957 4775 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21af7b0-6f27-43de-8c44-6e6519262019" containerName="console" Dec 16 15:09:31 crc kubenswrapper[4775]: I1216 15:09:31.791964 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21af7b0-6f27-43de-8c44-6e6519262019" containerName="console" Dec 16 15:09:31 crc kubenswrapper[4775]: I1216 15:09:31.792113 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="50c7dffe-e977-448f-bcdd-7a68df1cefca" containerName="extract" Dec 16 15:09:31 crc kubenswrapper[4775]: I1216 15:09:31.792134 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c21af7b0-6f27-43de-8c44-6e6519262019" containerName="console" Dec 16 15:09:31 crc kubenswrapper[4775]: I1216 15:09:31.792701 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk" Dec 16 15:09:31 crc kubenswrapper[4775]: I1216 15:09:31.795371 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 16 15:09:31 crc kubenswrapper[4775]: I1216 15:09:31.795756 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-w5xjn" Dec 16 15:09:31 crc kubenswrapper[4775]: I1216 15:09:31.796083 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 16 15:09:31 crc kubenswrapper[4775]: I1216 15:09:31.796221 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 16 15:09:31 crc kubenswrapper[4775]: I1216 15:09:31.796375 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 16 15:09:31 crc kubenswrapper[4775]: I1216 15:09:31.810849 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk"] Dec 16 15:09:31 crc kubenswrapper[4775]: I1216 15:09:31.956934 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csvpq\" (UniqueName: \"kubernetes.io/projected/4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd-kube-api-access-csvpq\") pod \"metallb-operator-controller-manager-6dbcb5f69b-g6llk\" (UID: \"4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd\") " pod="metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk" Dec 16 15:09:31 crc kubenswrapper[4775]: I1216 15:09:31.957004 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd-webhook-cert\") pod \"metallb-operator-controller-manager-6dbcb5f69b-g6llk\" (UID: \"4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd\") " pod="metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk" Dec 16 15:09:31 crc kubenswrapper[4775]: I1216 15:09:31.957036 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd-apiservice-cert\") pod \"metallb-operator-controller-manager-6dbcb5f69b-g6llk\" (UID: \"4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd\") " pod="metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.005206 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs"] Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.005924 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.007878 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-cjvng" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.009594 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.010504 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.031731 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs"] Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.058242 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csvpq\" (UniqueName: \"kubernetes.io/projected/4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd-kube-api-access-csvpq\") pod \"metallb-operator-controller-manager-6dbcb5f69b-g6llk\" (UID: \"4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd\") " pod="metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.058312 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd-webhook-cert\") pod \"metallb-operator-controller-manager-6dbcb5f69b-g6llk\" (UID: \"4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd\") " pod="metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.058344 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd-apiservice-cert\") pod 
\"metallb-operator-controller-manager-6dbcb5f69b-g6llk\" (UID: \"4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd\") " pod="metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.065505 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd-webhook-cert\") pod \"metallb-operator-controller-manager-6dbcb5f69b-g6llk\" (UID: \"4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd\") " pod="metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.066175 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd-apiservice-cert\") pod \"metallb-operator-controller-manager-6dbcb5f69b-g6llk\" (UID: \"4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd\") " pod="metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.090917 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csvpq\" (UniqueName: \"kubernetes.io/projected/4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd-kube-api-access-csvpq\") pod \"metallb-operator-controller-manager-6dbcb5f69b-g6llk\" (UID: \"4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd\") " pod="metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.128206 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.159852 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q257z\" (UniqueName: \"kubernetes.io/projected/e79447d0-f855-4f85-a021-0618e819f822-kube-api-access-q257z\") pod \"metallb-operator-webhook-server-7889b8b87-lgnbs\" (UID: \"e79447d0-f855-4f85-a021-0618e819f822\") " pod="metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.159926 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e79447d0-f855-4f85-a021-0618e819f822-webhook-cert\") pod \"metallb-operator-webhook-server-7889b8b87-lgnbs\" (UID: \"e79447d0-f855-4f85-a021-0618e819f822\") " pod="metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.159951 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e79447d0-f855-4f85-a021-0618e819f822-apiservice-cert\") pod \"metallb-operator-webhook-server-7889b8b87-lgnbs\" (UID: \"e79447d0-f855-4f85-a021-0618e819f822\") " pod="metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.261308 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q257z\" (UniqueName: \"kubernetes.io/projected/e79447d0-f855-4f85-a021-0618e819f822-kube-api-access-q257z\") pod \"metallb-operator-webhook-server-7889b8b87-lgnbs\" (UID: \"e79447d0-f855-4f85-a021-0618e819f822\") " pod="metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.261362 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e79447d0-f855-4f85-a021-0618e819f822-webhook-cert\") pod \"metallb-operator-webhook-server-7889b8b87-lgnbs\" (UID: \"e79447d0-f855-4f85-a021-0618e819f822\") " pod="metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.261385 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e79447d0-f855-4f85-a021-0618e819f822-apiservice-cert\") pod \"metallb-operator-webhook-server-7889b8b87-lgnbs\" (UID: \"e79447d0-f855-4f85-a021-0618e819f822\") " pod="metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.269725 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e79447d0-f855-4f85-a021-0618e819f822-webhook-cert\") pod \"metallb-operator-webhook-server-7889b8b87-lgnbs\" (UID: \"e79447d0-f855-4f85-a021-0618e819f822\") " pod="metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.274701 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e79447d0-f855-4f85-a021-0618e819f822-apiservice-cert\") pod \"metallb-operator-webhook-server-7889b8b87-lgnbs\" (UID: \"e79447d0-f855-4f85-a021-0618e819f822\") " pod="metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.296667 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q257z\" (UniqueName: \"kubernetes.io/projected/e79447d0-f855-4f85-a021-0618e819f822-kube-api-access-q257z\") pod \"metallb-operator-webhook-server-7889b8b87-lgnbs\" (UID: \"e79447d0-f855-4f85-a021-0618e819f822\") " 
pod="metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.334921 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs" Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.578735 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs"] Dec 16 15:09:32 crc kubenswrapper[4775]: I1216 15:09:32.619736 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk"] Dec 16 15:09:32 crc kubenswrapper[4775]: W1216 15:09:32.624586 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ca4d9bd_c3ac_4817_bb00_c5b25ec7b4cd.slice/crio-c8a4166d3b63d12490358fee3f129c36b3162ef113eec1cb1518d265b6c305a0 WatchSource:0}: Error finding container c8a4166d3b63d12490358fee3f129c36b3162ef113eec1cb1518d265b6c305a0: Status 404 returned error can't find the container with id c8a4166d3b63d12490358fee3f129c36b3162ef113eec1cb1518d265b6c305a0 Dec 16 15:09:33 crc kubenswrapper[4775]: I1216 15:09:33.478477 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk" event={"ID":"4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd","Type":"ContainerStarted","Data":"c8a4166d3b63d12490358fee3f129c36b3162ef113eec1cb1518d265b6c305a0"} Dec 16 15:09:33 crc kubenswrapper[4775]: I1216 15:09:33.480436 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs" event={"ID":"e79447d0-f855-4f85-a021-0618e819f822","Type":"ContainerStarted","Data":"4fe73b721d8154d00a8de2f051a691847e2176863ba6aa0b4ad77630efe9d3f4"} Dec 16 15:09:40 crc kubenswrapper[4775]: I1216 15:09:40.761292 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk" event={"ID":"4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd","Type":"ContainerStarted","Data":"e37cc86f47e6c0da9da86a7c14bedd8d78493b0357efeea4e527b93d89943447"} Dec 16 15:09:40 crc kubenswrapper[4775]: I1216 15:09:40.762175 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk" Dec 16 15:09:40 crc kubenswrapper[4775]: I1216 15:09:40.764808 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs" event={"ID":"e79447d0-f855-4f85-a021-0618e819f822","Type":"ContainerStarted","Data":"05aa40606b8ffd1bd0c5e55ce331516ec46585d186e92706b31effc8d2a6a964"} Dec 16 15:09:40 crc kubenswrapper[4775]: I1216 15:09:40.807006 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk" podStartSLOduration=2.34334851 podStartE2EDuration="9.806989502s" podCreationTimestamp="2025-12-16 15:09:31 +0000 UTC" firstStartedPulling="2025-12-16 15:09:32.627342756 +0000 UTC m=+897.578421679" lastFinishedPulling="2025-12-16 15:09:40.090983738 +0000 UTC m=+905.042062671" observedRunningTime="2025-12-16 15:09:40.80503772 +0000 UTC m=+905.756116663" watchObservedRunningTime="2025-12-16 15:09:40.806989502 +0000 UTC m=+905.758068425" Dec 16 15:09:40 crc kubenswrapper[4775]: I1216 15:09:40.829348 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs" podStartSLOduration=2.312703488 podStartE2EDuration="9.829327577s" podCreationTimestamp="2025-12-16 15:09:31 +0000 UTC" firstStartedPulling="2025-12-16 15:09:32.595158386 +0000 UTC m=+897.546237299" lastFinishedPulling="2025-12-16 15:09:40.111782455 +0000 UTC m=+905.062861388" observedRunningTime="2025-12-16 15:09:40.826560219 +0000 UTC m=+905.777639162" 
watchObservedRunningTime="2025-12-16 15:09:40.829327577 +0000 UTC m=+905.780406500" Dec 16 15:09:41 crc kubenswrapper[4775]: I1216 15:09:41.770163 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs" Dec 16 15:09:52 crc kubenswrapper[4775]: I1216 15:09:52.342436 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7889b8b87-lgnbs" Dec 16 15:09:53 crc kubenswrapper[4775]: I1216 15:09:53.722832 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7f22b"] Dec 16 15:09:53 crc kubenswrapper[4775]: I1216 15:09:53.724444 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7f22b" Dec 16 15:09:53 crc kubenswrapper[4775]: I1216 15:09:53.737809 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7f22b"] Dec 16 15:09:53 crc kubenswrapper[4775]: I1216 15:09:53.846480 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74vvc\" (UniqueName: \"kubernetes.io/projected/49fef8e3-d561-4328-adbf-2a9f096f731a-kube-api-access-74vvc\") pod \"certified-operators-7f22b\" (UID: \"49fef8e3-d561-4328-adbf-2a9f096f731a\") " pod="openshift-marketplace/certified-operators-7f22b" Dec 16 15:09:53 crc kubenswrapper[4775]: I1216 15:09:53.846546 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49fef8e3-d561-4328-adbf-2a9f096f731a-utilities\") pod \"certified-operators-7f22b\" (UID: \"49fef8e3-d561-4328-adbf-2a9f096f731a\") " pod="openshift-marketplace/certified-operators-7f22b" Dec 16 15:09:53 crc kubenswrapper[4775]: I1216 15:09:53.846613 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49fef8e3-d561-4328-adbf-2a9f096f731a-catalog-content\") pod \"certified-operators-7f22b\" (UID: \"49fef8e3-d561-4328-adbf-2a9f096f731a\") " pod="openshift-marketplace/certified-operators-7f22b" Dec 16 15:09:53 crc kubenswrapper[4775]: I1216 15:09:53.948061 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74vvc\" (UniqueName: \"kubernetes.io/projected/49fef8e3-d561-4328-adbf-2a9f096f731a-kube-api-access-74vvc\") pod \"certified-operators-7f22b\" (UID: \"49fef8e3-d561-4328-adbf-2a9f096f731a\") " pod="openshift-marketplace/certified-operators-7f22b" Dec 16 15:09:53 crc kubenswrapper[4775]: I1216 15:09:53.948137 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49fef8e3-d561-4328-adbf-2a9f096f731a-utilities\") pod \"certified-operators-7f22b\" (UID: \"49fef8e3-d561-4328-adbf-2a9f096f731a\") " pod="openshift-marketplace/certified-operators-7f22b" Dec 16 15:09:53 crc kubenswrapper[4775]: I1216 15:09:53.948396 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49fef8e3-d561-4328-adbf-2a9f096f731a-catalog-content\") pod \"certified-operators-7f22b\" (UID: \"49fef8e3-d561-4328-adbf-2a9f096f731a\") " pod="openshift-marketplace/certified-operators-7f22b" Dec 16 15:09:53 crc kubenswrapper[4775]: I1216 15:09:53.949036 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49fef8e3-d561-4328-adbf-2a9f096f731a-catalog-content\") pod \"certified-operators-7f22b\" (UID: \"49fef8e3-d561-4328-adbf-2a9f096f731a\") " pod="openshift-marketplace/certified-operators-7f22b" Dec 16 15:09:53 crc kubenswrapper[4775]: I1216 15:09:53.949276 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49fef8e3-d561-4328-adbf-2a9f096f731a-utilities\") pod \"certified-operators-7f22b\" (UID: \"49fef8e3-d561-4328-adbf-2a9f096f731a\") " pod="openshift-marketplace/certified-operators-7f22b" Dec 16 15:09:53 crc kubenswrapper[4775]: I1216 15:09:53.977413 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74vvc\" (UniqueName: \"kubernetes.io/projected/49fef8e3-d561-4328-adbf-2a9f096f731a-kube-api-access-74vvc\") pod \"certified-operators-7f22b\" (UID: \"49fef8e3-d561-4328-adbf-2a9f096f731a\") " pod="openshift-marketplace/certified-operators-7f22b" Dec 16 15:09:54 crc kubenswrapper[4775]: I1216 15:09:54.044464 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7f22b" Dec 16 15:09:54 crc kubenswrapper[4775]: I1216 15:09:54.810723 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7f22b"] Dec 16 15:09:54 crc kubenswrapper[4775]: I1216 15:09:54.866767 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7f22b" event={"ID":"49fef8e3-d561-4328-adbf-2a9f096f731a","Type":"ContainerStarted","Data":"33baa1bcee9f7474bdbab9e8c2efdfa608ffd1b83299f7948c3b425f614ec1c1"} Dec 16 15:09:57 crc kubenswrapper[4775]: I1216 15:09:57.885581 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7f22b" event={"ID":"49fef8e3-d561-4328-adbf-2a9f096f731a","Type":"ContainerStarted","Data":"2934c2ca886424b3732b3a92b0f3217839a4ad39d292c7854ba7e5ee39d8f1c7"} Dec 16 15:09:58 crc kubenswrapper[4775]: I1216 15:09:58.893499 4775 generic.go:334] "Generic (PLEG): container finished" podID="49fef8e3-d561-4328-adbf-2a9f096f731a" containerID="2934c2ca886424b3732b3a92b0f3217839a4ad39d292c7854ba7e5ee39d8f1c7" exitCode=0 Dec 16 15:09:58 crc kubenswrapper[4775]: I1216 
15:09:58.893550 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7f22b" event={"ID":"49fef8e3-d561-4328-adbf-2a9f096f731a","Type":"ContainerDied","Data":"2934c2ca886424b3732b3a92b0f3217839a4ad39d292c7854ba7e5ee39d8f1c7"} Dec 16 15:09:59 crc kubenswrapper[4775]: I1216 15:09:59.900767 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7f22b" event={"ID":"49fef8e3-d561-4328-adbf-2a9f096f731a","Type":"ContainerStarted","Data":"e14a9a9b582c35f73da083b678b308fc81a14ac0bf01db63ebd156f35de9fdf1"} Dec 16 15:10:00 crc kubenswrapper[4775]: I1216 15:10:00.119056 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sv5fs"] Dec 16 15:10:00 crc kubenswrapper[4775]: I1216 15:10:00.120229 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sv5fs" Dec 16 15:10:00 crc kubenswrapper[4775]: I1216 15:10:00.210613 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f600eb7-4910-45dc-a250-4641387dc789-catalog-content\") pod \"redhat-marketplace-sv5fs\" (UID: \"5f600eb7-4910-45dc-a250-4641387dc789\") " pod="openshift-marketplace/redhat-marketplace-sv5fs" Dec 16 15:10:00 crc kubenswrapper[4775]: I1216 15:10:00.210711 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88rz7\" (UniqueName: \"kubernetes.io/projected/5f600eb7-4910-45dc-a250-4641387dc789-kube-api-access-88rz7\") pod \"redhat-marketplace-sv5fs\" (UID: \"5f600eb7-4910-45dc-a250-4641387dc789\") " pod="openshift-marketplace/redhat-marketplace-sv5fs" Dec 16 15:10:00 crc kubenswrapper[4775]: I1216 15:10:00.210757 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5f600eb7-4910-45dc-a250-4641387dc789-utilities\") pod \"redhat-marketplace-sv5fs\" (UID: \"5f600eb7-4910-45dc-a250-4641387dc789\") " pod="openshift-marketplace/redhat-marketplace-sv5fs" Dec 16 15:10:00 crc kubenswrapper[4775]: I1216 15:10:00.267348 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sv5fs"] Dec 16 15:10:00 crc kubenswrapper[4775]: I1216 15:10:00.312010 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88rz7\" (UniqueName: \"kubernetes.io/projected/5f600eb7-4910-45dc-a250-4641387dc789-kube-api-access-88rz7\") pod \"redhat-marketplace-sv5fs\" (UID: \"5f600eb7-4910-45dc-a250-4641387dc789\") " pod="openshift-marketplace/redhat-marketplace-sv5fs" Dec 16 15:10:00 crc kubenswrapper[4775]: I1216 15:10:00.312132 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f600eb7-4910-45dc-a250-4641387dc789-utilities\") pod \"redhat-marketplace-sv5fs\" (UID: \"5f600eb7-4910-45dc-a250-4641387dc789\") " pod="openshift-marketplace/redhat-marketplace-sv5fs" Dec 16 15:10:00 crc kubenswrapper[4775]: I1216 15:10:00.312186 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f600eb7-4910-45dc-a250-4641387dc789-catalog-content\") pod \"redhat-marketplace-sv5fs\" (UID: \"5f600eb7-4910-45dc-a250-4641387dc789\") " pod="openshift-marketplace/redhat-marketplace-sv5fs" Dec 16 15:10:00 crc kubenswrapper[4775]: I1216 15:10:00.312784 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f600eb7-4910-45dc-a250-4641387dc789-utilities\") pod \"redhat-marketplace-sv5fs\" (UID: \"5f600eb7-4910-45dc-a250-4641387dc789\") " pod="openshift-marketplace/redhat-marketplace-sv5fs" Dec 16 15:10:00 crc kubenswrapper[4775]: I1216 
15:10:00.313036 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f600eb7-4910-45dc-a250-4641387dc789-catalog-content\") pod \"redhat-marketplace-sv5fs\" (UID: \"5f600eb7-4910-45dc-a250-4641387dc789\") " pod="openshift-marketplace/redhat-marketplace-sv5fs" Dec 16 15:10:00 crc kubenswrapper[4775]: I1216 15:10:00.449607 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88rz7\" (UniqueName: \"kubernetes.io/projected/5f600eb7-4910-45dc-a250-4641387dc789-kube-api-access-88rz7\") pod \"redhat-marketplace-sv5fs\" (UID: \"5f600eb7-4910-45dc-a250-4641387dc789\") " pod="openshift-marketplace/redhat-marketplace-sv5fs" Dec 16 15:10:00 crc kubenswrapper[4775]: I1216 15:10:00.737924 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sv5fs" Dec 16 15:10:00 crc kubenswrapper[4775]: I1216 15:10:00.908848 4775 generic.go:334] "Generic (PLEG): container finished" podID="49fef8e3-d561-4328-adbf-2a9f096f731a" containerID="e14a9a9b582c35f73da083b678b308fc81a14ac0bf01db63ebd156f35de9fdf1" exitCode=0 Dec 16 15:10:00 crc kubenswrapper[4775]: I1216 15:10:00.909054 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7f22b" event={"ID":"49fef8e3-d561-4328-adbf-2a9f096f731a","Type":"ContainerDied","Data":"e14a9a9b582c35f73da083b678b308fc81a14ac0bf01db63ebd156f35de9fdf1"} Dec 16 15:10:01 crc kubenswrapper[4775]: I1216 15:10:01.247486 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sv5fs"] Dec 16 15:10:01 crc kubenswrapper[4775]: I1216 15:10:01.918234 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7f22b" 
event={"ID":"49fef8e3-d561-4328-adbf-2a9f096f731a","Type":"ContainerStarted","Data":"22536a7522f3c44819f4fb422f51d926fa417b9712eab60f09378f1b103878f6"} Dec 16 15:10:01 crc kubenswrapper[4775]: I1216 15:10:01.920974 4775 generic.go:334] "Generic (PLEG): container finished" podID="5f600eb7-4910-45dc-a250-4641387dc789" containerID="ff58939e5a023bba6b2e0d14c671bcbb59c4a1ee0da0f26993784d3320508a6e" exitCode=0 Dec 16 15:10:01 crc kubenswrapper[4775]: I1216 15:10:01.921019 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sv5fs" event={"ID":"5f600eb7-4910-45dc-a250-4641387dc789","Type":"ContainerDied","Data":"ff58939e5a023bba6b2e0d14c671bcbb59c4a1ee0da0f26993784d3320508a6e"} Dec 16 15:10:01 crc kubenswrapper[4775]: I1216 15:10:01.921076 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sv5fs" event={"ID":"5f600eb7-4910-45dc-a250-4641387dc789","Type":"ContainerStarted","Data":"c114ecae9eb91f7847286f2d1a3b130d1e063c29f0cc904792568973006e4740"} Dec 16 15:10:01 crc kubenswrapper[4775]: I1216 15:10:01.939035 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7f22b" podStartSLOduration=6.129465883 podStartE2EDuration="8.939011994s" podCreationTimestamp="2025-12-16 15:09:53 +0000 UTC" firstStartedPulling="2025-12-16 15:09:58.895044465 +0000 UTC m=+923.846123388" lastFinishedPulling="2025-12-16 15:10:01.704590556 +0000 UTC m=+926.655669499" observedRunningTime="2025-12-16 15:10:01.936222235 +0000 UTC m=+926.887301178" watchObservedRunningTime="2025-12-16 15:10:01.939011994 +0000 UTC m=+926.890090927" Dec 16 15:10:03 crc kubenswrapper[4775]: I1216 15:10:03.933729 4775 generic.go:334] "Generic (PLEG): container finished" podID="5f600eb7-4910-45dc-a250-4641387dc789" containerID="d08d951ac596a463d3bd07ec9df80a413dcf41be12ee84fd47d7deb9e76776ae" exitCode=0 Dec 16 15:10:03 crc kubenswrapper[4775]: I1216 
15:10:03.933811 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sv5fs" event={"ID":"5f600eb7-4910-45dc-a250-4641387dc789","Type":"ContainerDied","Data":"d08d951ac596a463d3bd07ec9df80a413dcf41be12ee84fd47d7deb9e76776ae"} Dec 16 15:10:04 crc kubenswrapper[4775]: I1216 15:10:04.045467 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7f22b" Dec 16 15:10:04 crc kubenswrapper[4775]: I1216 15:10:04.045541 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7f22b" Dec 16 15:10:04 crc kubenswrapper[4775]: I1216 15:10:04.090023 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7f22b" Dec 16 15:10:04 crc kubenswrapper[4775]: I1216 15:10:04.942258 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sv5fs" event={"ID":"5f600eb7-4910-45dc-a250-4641387dc789","Type":"ContainerStarted","Data":"b3c47fcb8a707909d99c240dce49606ad1bfecfda30cab1c189e65555694f1da"} Dec 16 15:10:04 crc kubenswrapper[4775]: I1216 15:10:04.968294 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sv5fs" podStartSLOduration=2.52303997 podStartE2EDuration="4.968275011s" podCreationTimestamp="2025-12-16 15:10:00 +0000 UTC" firstStartedPulling="2025-12-16 15:10:01.922622339 +0000 UTC m=+926.873701272" lastFinishedPulling="2025-12-16 15:10:04.36785739 +0000 UTC m=+929.318936313" observedRunningTime="2025-12-16 15:10:04.964483249 +0000 UTC m=+929.915562192" watchObservedRunningTime="2025-12-16 15:10:04.968275011 +0000 UTC m=+929.919353934" Dec 16 15:10:10 crc kubenswrapper[4775]: I1216 15:10:10.236852 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fttv2"] Dec 16 15:10:10 crc 
kubenswrapper[4775]: I1216 15:10:10.238446 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fttv2" Dec 16 15:10:10 crc kubenswrapper[4775]: I1216 15:10:10.260637 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6mfw\" (UniqueName: \"kubernetes.io/projected/69dd7d91-c613-4a03-ade3-805ce6879564-kube-api-access-k6mfw\") pod \"community-operators-fttv2\" (UID: \"69dd7d91-c613-4a03-ade3-805ce6879564\") " pod="openshift-marketplace/community-operators-fttv2" Dec 16 15:10:10 crc kubenswrapper[4775]: I1216 15:10:10.261027 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69dd7d91-c613-4a03-ade3-805ce6879564-utilities\") pod \"community-operators-fttv2\" (UID: \"69dd7d91-c613-4a03-ade3-805ce6879564\") " pod="openshift-marketplace/community-operators-fttv2" Dec 16 15:10:10 crc kubenswrapper[4775]: I1216 15:10:10.261258 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69dd7d91-c613-4a03-ade3-805ce6879564-catalog-content\") pod \"community-operators-fttv2\" (UID: \"69dd7d91-c613-4a03-ade3-805ce6879564\") " pod="openshift-marketplace/community-operators-fttv2" Dec 16 15:10:10 crc kubenswrapper[4775]: I1216 15:10:10.266517 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fttv2"] Dec 16 15:10:10 crc kubenswrapper[4775]: I1216 15:10:10.362458 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6mfw\" (UniqueName: \"kubernetes.io/projected/69dd7d91-c613-4a03-ade3-805ce6879564-kube-api-access-k6mfw\") pod \"community-operators-fttv2\" (UID: \"69dd7d91-c613-4a03-ade3-805ce6879564\") " 
pod="openshift-marketplace/community-operators-fttv2" Dec 16 15:10:10 crc kubenswrapper[4775]: I1216 15:10:10.362507 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69dd7d91-c613-4a03-ade3-805ce6879564-utilities\") pod \"community-operators-fttv2\" (UID: \"69dd7d91-c613-4a03-ade3-805ce6879564\") " pod="openshift-marketplace/community-operators-fttv2" Dec 16 15:10:10 crc kubenswrapper[4775]: I1216 15:10:10.362578 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69dd7d91-c613-4a03-ade3-805ce6879564-catalog-content\") pod \"community-operators-fttv2\" (UID: \"69dd7d91-c613-4a03-ade3-805ce6879564\") " pod="openshift-marketplace/community-operators-fttv2" Dec 16 15:10:10 crc kubenswrapper[4775]: I1216 15:10:10.364590 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69dd7d91-c613-4a03-ade3-805ce6879564-utilities\") pod \"community-operators-fttv2\" (UID: \"69dd7d91-c613-4a03-ade3-805ce6879564\") " pod="openshift-marketplace/community-operators-fttv2" Dec 16 15:10:10 crc kubenswrapper[4775]: I1216 15:10:10.365421 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69dd7d91-c613-4a03-ade3-805ce6879564-catalog-content\") pod \"community-operators-fttv2\" (UID: \"69dd7d91-c613-4a03-ade3-805ce6879564\") " pod="openshift-marketplace/community-operators-fttv2" Dec 16 15:10:10 crc kubenswrapper[4775]: I1216 15:10:10.394771 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6mfw\" (UniqueName: \"kubernetes.io/projected/69dd7d91-c613-4a03-ade3-805ce6879564-kube-api-access-k6mfw\") pod \"community-operators-fttv2\" (UID: \"69dd7d91-c613-4a03-ade3-805ce6879564\") " 
pod="openshift-marketplace/community-operators-fttv2" Dec 16 15:10:10 crc kubenswrapper[4775]: I1216 15:10:10.645429 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fttv2" Dec 16 15:10:10 crc kubenswrapper[4775]: I1216 15:10:10.808139 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sv5fs" Dec 16 15:10:10 crc kubenswrapper[4775]: I1216 15:10:10.808188 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sv5fs" Dec 16 15:10:10 crc kubenswrapper[4775]: I1216 15:10:10.914766 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sv5fs" Dec 16 15:10:11 crc kubenswrapper[4775]: I1216 15:10:11.056756 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sv5fs" Dec 16 15:10:11 crc kubenswrapper[4775]: I1216 15:10:11.118529 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fttv2"] Dec 16 15:10:11 crc kubenswrapper[4775]: I1216 15:10:11.992613 4775 generic.go:334] "Generic (PLEG): container finished" podID="69dd7d91-c613-4a03-ade3-805ce6879564" containerID="06620ea294ae4114bc4b41b08eb0489b0f61c87a21d574b314d8da42a8994a33" exitCode=0 Dec 16 15:10:11 crc kubenswrapper[4775]: I1216 15:10:11.992784 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fttv2" event={"ID":"69dd7d91-c613-4a03-ade3-805ce6879564","Type":"ContainerDied","Data":"06620ea294ae4114bc4b41b08eb0489b0f61c87a21d574b314d8da42a8994a33"} Dec 16 15:10:11 crc kubenswrapper[4775]: I1216 15:10:11.993108 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fttv2" 
event={"ID":"69dd7d91-c613-4a03-ade3-805ce6879564","Type":"ContainerStarted","Data":"7039d00a2584b4c2a56d5becededb757a2372173aa3c7ca8661041c5109e3fa8"} Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.135005 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6dbcb5f69b-g6llk" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.822578 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-6hfpf"] Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.823553 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-6hfpf" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.832301 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.832560 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-nqgkk" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.837130 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-2txt9"] Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.839769 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.840229 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8mrj\" (UniqueName: \"kubernetes.io/projected/b55d23ff-e3e6-460c-8058-0489204c8a4d-kube-api-access-d8mrj\") pod \"frr-k8s-webhook-server-7784b6fcf-6hfpf\" (UID: \"b55d23ff-e3e6-460c-8058-0489204c8a4d\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-6hfpf" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.840285 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b55d23ff-e3e6-460c-8058-0489204c8a4d-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-6hfpf\" (UID: \"b55d23ff-e3e6-460c-8058-0489204c8a4d\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-6hfpf" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.842910 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-6hfpf"] Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.842931 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.847103 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.908231 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-hzvbb"] Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.909370 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-hzvbb" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.916745 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.916779 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-wvhzz" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.916792 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.916853 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.922612 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5bddd4b946-khxxp"] Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.923740 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bddd4b946-khxxp" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.925956 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.940754 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d80b883c-02c1-4d56-a369-addb8c7bfdca-frr-conf\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.940794 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d80b883c-02c1-4d56-a369-addb8c7bfdca-frr-startup\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.940820 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b55d23ff-e3e6-460c-8058-0489204c8a4d-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-6hfpf\" (UID: \"b55d23ff-e3e6-460c-8058-0489204c8a4d\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-6hfpf" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.940846 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d80b883c-02c1-4d56-a369-addb8c7bfdca-frr-sockets\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.940869 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-memberlist\") pod \"speaker-hzvbb\" (UID: \"2516e125-5678-4a01-8a6b-1f8865b69f77\") " pod="metallb-system/speaker-hzvbb" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.940943 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d80b883c-02c1-4d56-a369-addb8c7bfdca-reloader\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.940963 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d80b883c-02c1-4d56-a369-addb8c7bfdca-metrics\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.940979 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwkgc\" (UniqueName: \"kubernetes.io/projected/aa78cbdc-f63e-4010-9bdc-88715f997591-kube-api-access-jwkgc\") pod \"controller-5bddd4b946-khxxp\" (UID: \"aa78cbdc-f63e-4010-9bdc-88715f997591\") " pod="metallb-system/controller-5bddd4b946-khxxp" Dec 16 15:10:12 crc kubenswrapper[4775]: E1216 15:10:12.940988 4775 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 16 15:10:12 crc kubenswrapper[4775]: E1216 15:10:12.941069 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b55d23ff-e3e6-460c-8058-0489204c8a4d-cert podName:b55d23ff-e3e6-460c-8058-0489204c8a4d nodeName:}" failed. No retries permitted until 2025-12-16 15:10:13.441043609 +0000 UTC m=+938.392122602 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b55d23ff-e3e6-460c-8058-0489204c8a4d-cert") pod "frr-k8s-webhook-server-7784b6fcf-6hfpf" (UID: "b55d23ff-e3e6-460c-8058-0489204c8a4d") : secret "frr-k8s-webhook-server-cert" not found Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.941001 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d80b883c-02c1-4d56-a369-addb8c7bfdca-metrics-certs\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.941153 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa78cbdc-f63e-4010-9bdc-88715f997591-metrics-certs\") pod \"controller-5bddd4b946-khxxp\" (UID: \"aa78cbdc-f63e-4010-9bdc-88715f997591\") " pod="metallb-system/controller-5bddd4b946-khxxp" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.941265 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgvgx\" (UniqueName: \"kubernetes.io/projected/2516e125-5678-4a01-8a6b-1f8865b69f77-kube-api-access-kgvgx\") pod \"speaker-hzvbb\" (UID: \"2516e125-5678-4a01-8a6b-1f8865b69f77\") " pod="metallb-system/speaker-hzvbb" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.941343 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-metrics-certs\") pod \"speaker-hzvbb\" (UID: \"2516e125-5678-4a01-8a6b-1f8865b69f77\") " pod="metallb-system/speaker-hzvbb" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.941403 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/aa78cbdc-f63e-4010-9bdc-88715f997591-cert\") pod \"controller-5bddd4b946-khxxp\" (UID: \"aa78cbdc-f63e-4010-9bdc-88715f997591\") " pod="metallb-system/controller-5bddd4b946-khxxp" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.941460 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2516e125-5678-4a01-8a6b-1f8865b69f77-metallb-excludel2\") pod \"speaker-hzvbb\" (UID: \"2516e125-5678-4a01-8a6b-1f8865b69f77\") " pod="metallb-system/speaker-hzvbb" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.941548 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql8jr\" (UniqueName: \"kubernetes.io/projected/d80b883c-02c1-4d56-a369-addb8c7bfdca-kube-api-access-ql8jr\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.941662 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8mrj\" (UniqueName: \"kubernetes.io/projected/b55d23ff-e3e6-460c-8058-0489204c8a4d-kube-api-access-d8mrj\") pod \"frr-k8s-webhook-server-7784b6fcf-6hfpf\" (UID: \"b55d23ff-e3e6-460c-8058-0489204c8a4d\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-6hfpf" Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.947477 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-khxxp"] Dec 16 15:10:12 crc kubenswrapper[4775]: I1216 15:10:12.967036 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8mrj\" (UniqueName: \"kubernetes.io/projected/b55d23ff-e3e6-460c-8058-0489204c8a4d-kube-api-access-d8mrj\") pod \"frr-k8s-webhook-server-7784b6fcf-6hfpf\" (UID: \"b55d23ff-e3e6-460c-8058-0489204c8a4d\") " 
pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-6hfpf" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.043161 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa78cbdc-f63e-4010-9bdc-88715f997591-cert\") pod \"controller-5bddd4b946-khxxp\" (UID: \"aa78cbdc-f63e-4010-9bdc-88715f997591\") " pod="metallb-system/controller-5bddd4b946-khxxp" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.043224 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2516e125-5678-4a01-8a6b-1f8865b69f77-metallb-excludel2\") pod \"speaker-hzvbb\" (UID: \"2516e125-5678-4a01-8a6b-1f8865b69f77\") " pod="metallb-system/speaker-hzvbb" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.043260 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql8jr\" (UniqueName: \"kubernetes.io/projected/d80b883c-02c1-4d56-a369-addb8c7bfdca-kube-api-access-ql8jr\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.043302 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d80b883c-02c1-4d56-a369-addb8c7bfdca-frr-conf\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.043330 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d80b883c-02c1-4d56-a369-addb8c7bfdca-frr-startup\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.043389 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d80b883c-02c1-4d56-a369-addb8c7bfdca-frr-sockets\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.043428 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-memberlist\") pod \"speaker-hzvbb\" (UID: \"2516e125-5678-4a01-8a6b-1f8865b69f77\") " pod="metallb-system/speaker-hzvbb" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.043456 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d80b883c-02c1-4d56-a369-addb8c7bfdca-reloader\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.043484 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d80b883c-02c1-4d56-a369-addb8c7bfdca-metrics\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.043507 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d80b883c-02c1-4d56-a369-addb8c7bfdca-metrics-certs\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.043529 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwkgc\" (UniqueName: \"kubernetes.io/projected/aa78cbdc-f63e-4010-9bdc-88715f997591-kube-api-access-jwkgc\") pod \"controller-5bddd4b946-khxxp\" (UID: 
\"aa78cbdc-f63e-4010-9bdc-88715f997591\") " pod="metallb-system/controller-5bddd4b946-khxxp" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.043551 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa78cbdc-f63e-4010-9bdc-88715f997591-metrics-certs\") pod \"controller-5bddd4b946-khxxp\" (UID: \"aa78cbdc-f63e-4010-9bdc-88715f997591\") " pod="metallb-system/controller-5bddd4b946-khxxp" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.043587 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgvgx\" (UniqueName: \"kubernetes.io/projected/2516e125-5678-4a01-8a6b-1f8865b69f77-kube-api-access-kgvgx\") pod \"speaker-hzvbb\" (UID: \"2516e125-5678-4a01-8a6b-1f8865b69f77\") " pod="metallb-system/speaker-hzvbb" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.043613 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-metrics-certs\") pod \"speaker-hzvbb\" (UID: \"2516e125-5678-4a01-8a6b-1f8865b69f77\") " pod="metallb-system/speaker-hzvbb" Dec 16 15:10:13 crc kubenswrapper[4775]: E1216 15:10:13.043758 4775 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 16 15:10:13 crc kubenswrapper[4775]: E1216 15:10:13.043823 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-metrics-certs podName:2516e125-5678-4a01-8a6b-1f8865b69f77 nodeName:}" failed. No retries permitted until 2025-12-16 15:10:13.5438007 +0000 UTC m=+938.494879623 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-metrics-certs") pod "speaker-hzvbb" (UID: "2516e125-5678-4a01-8a6b-1f8865b69f77") : secret "speaker-certs-secret" not found Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.043959 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d80b883c-02c1-4d56-a369-addb8c7bfdca-frr-conf\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:13 crc kubenswrapper[4775]: E1216 15:10:13.044081 4775 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 16 15:10:13 crc kubenswrapper[4775]: E1216 15:10:13.044090 4775 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 16 15:10:13 crc kubenswrapper[4775]: E1216 15:10:13.044121 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa78cbdc-f63e-4010-9bdc-88715f997591-metrics-certs podName:aa78cbdc-f63e-4010-9bdc-88715f997591 nodeName:}" failed. No retries permitted until 2025-12-16 15:10:13.54411124 +0000 UTC m=+938.495190163 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa78cbdc-f63e-4010-9bdc-88715f997591-metrics-certs") pod "controller-5bddd4b946-khxxp" (UID: "aa78cbdc-f63e-4010-9bdc-88715f997591") : secret "controller-certs-secret" not found Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.044087 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d80b883c-02c1-4d56-a369-addb8c7bfdca-metrics\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:13 crc kubenswrapper[4775]: E1216 15:10:13.044146 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-memberlist podName:2516e125-5678-4a01-8a6b-1f8865b69f77 nodeName:}" failed. No retries permitted until 2025-12-16 15:10:13.544134211 +0000 UTC m=+938.495213244 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-memberlist") pod "speaker-hzvbb" (UID: "2516e125-5678-4a01-8a6b-1f8865b69f77") : secret "metallb-memberlist" not found Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.044266 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2516e125-5678-4a01-8a6b-1f8865b69f77-metallb-excludel2\") pod \"speaker-hzvbb\" (UID: \"2516e125-5678-4a01-8a6b-1f8865b69f77\") " pod="metallb-system/speaker-hzvbb" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.044329 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d80b883c-02c1-4d56-a369-addb8c7bfdca-reloader\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 
15:10:13.044498 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d80b883c-02c1-4d56-a369-addb8c7bfdca-frr-sockets\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.044650 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d80b883c-02c1-4d56-a369-addb8c7bfdca-frr-startup\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.047417 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.047794 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d80b883c-02c1-4d56-a369-addb8c7bfdca-metrics-certs\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.060312 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa78cbdc-f63e-4010-9bdc-88715f997591-cert\") pod \"controller-5bddd4b946-khxxp\" (UID: \"aa78cbdc-f63e-4010-9bdc-88715f997591\") " pod="metallb-system/controller-5bddd4b946-khxxp" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.068564 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgvgx\" (UniqueName: \"kubernetes.io/projected/2516e125-5678-4a01-8a6b-1f8865b69f77-kube-api-access-kgvgx\") pod \"speaker-hzvbb\" (UID: \"2516e125-5678-4a01-8a6b-1f8865b69f77\") " pod="metallb-system/speaker-hzvbb" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.069003 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwkgc\" (UniqueName: \"kubernetes.io/projected/aa78cbdc-f63e-4010-9bdc-88715f997591-kube-api-access-jwkgc\") pod \"controller-5bddd4b946-khxxp\" (UID: \"aa78cbdc-f63e-4010-9bdc-88715f997591\") " pod="metallb-system/controller-5bddd4b946-khxxp" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.074580 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql8jr\" (UniqueName: \"kubernetes.io/projected/d80b883c-02c1-4d56-a369-addb8c7bfdca-kube-api-access-ql8jr\") pod \"frr-k8s-2txt9\" (UID: \"d80b883c-02c1-4d56-a369-addb8c7bfdca\") " pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.160786 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.212696 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sv5fs"] Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.213007 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sv5fs" podUID="5f600eb7-4910-45dc-a250-4641387dc789" containerName="registry-server" containerID="cri-o://b3c47fcb8a707909d99c240dce49606ad1bfecfda30cab1c189e65555694f1da" gracePeriod=2 Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.448772 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b55d23ff-e3e6-460c-8058-0489204c8a4d-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-6hfpf\" (UID: \"b55d23ff-e3e6-460c-8058-0489204c8a4d\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-6hfpf" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.452362 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/b55d23ff-e3e6-460c-8058-0489204c8a4d-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-6hfpf\" (UID: \"b55d23ff-e3e6-460c-8058-0489204c8a4d\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-6hfpf" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.549720 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-metrics-certs\") pod \"speaker-hzvbb\" (UID: \"2516e125-5678-4a01-8a6b-1f8865b69f77\") " pod="metallb-system/speaker-hzvbb" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.549834 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-memberlist\") pod \"speaker-hzvbb\" (UID: \"2516e125-5678-4a01-8a6b-1f8865b69f77\") " pod="metallb-system/speaker-hzvbb" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.549869 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa78cbdc-f63e-4010-9bdc-88715f997591-metrics-certs\") pod \"controller-5bddd4b946-khxxp\" (UID: \"aa78cbdc-f63e-4010-9bdc-88715f997591\") " pod="metallb-system/controller-5bddd4b946-khxxp" Dec 16 15:10:13 crc kubenswrapper[4775]: E1216 15:10:13.550315 4775 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 16 15:10:13 crc kubenswrapper[4775]: E1216 15:10:13.550420 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-memberlist podName:2516e125-5678-4a01-8a6b-1f8865b69f77 nodeName:}" failed. No retries permitted until 2025-12-16 15:10:14.550392707 +0000 UTC m=+939.501471650 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-memberlist") pod "speaker-hzvbb" (UID: "2516e125-5678-4a01-8a6b-1f8865b69f77") : secret "metallb-memberlist" not found Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.555507 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa78cbdc-f63e-4010-9bdc-88715f997591-metrics-certs\") pod \"controller-5bddd4b946-khxxp\" (UID: \"aa78cbdc-f63e-4010-9bdc-88715f997591\") " pod="metallb-system/controller-5bddd4b946-khxxp" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.556158 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-metrics-certs\") pod \"speaker-hzvbb\" (UID: \"2516e125-5678-4a01-8a6b-1f8865b69f77\") " pod="metallb-system/speaker-hzvbb" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.752537 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-6hfpf" Dec 16 15:10:13 crc kubenswrapper[4775]: I1216 15:10:13.837278 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bddd4b946-khxxp" Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.021765 4775 generic.go:334] "Generic (PLEG): container finished" podID="5f600eb7-4910-45dc-a250-4641387dc789" containerID="b3c47fcb8a707909d99c240dce49606ad1bfecfda30cab1c189e65555694f1da" exitCode=0 Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.022084 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sv5fs" event={"ID":"5f600eb7-4910-45dc-a250-4641387dc789","Type":"ContainerDied","Data":"b3c47fcb8a707909d99c240dce49606ad1bfecfda30cab1c189e65555694f1da"} Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.022914 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2txt9" event={"ID":"d80b883c-02c1-4d56-a369-addb8c7bfdca","Type":"ContainerStarted","Data":"97950008ba22fb5cbd1807f4bc39a44a070bd7ffa5238d894d78c1471781d2db"} Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.024247 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fttv2" event={"ID":"69dd7d91-c613-4a03-ade3-805ce6879564","Type":"ContainerStarted","Data":"55792c476b9e1d02632fa9470960a8d4d363885363a57535055b1b45bf623b5f"} Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.146947 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7f22b" Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.232974 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-6hfpf"] Dec 16 15:10:14 crc kubenswrapper[4775]: W1216 15:10:14.499565 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa78cbdc_f63e_4010_9bdc_88715f997591.slice/crio-b9259e9c37a0c7a9a82fa94a2a2f12535859da9b46e0cd25f21ef7bb5c53400a WatchSource:0}: Error finding container 
b9259e9c37a0c7a9a82fa94a2a2f12535859da9b46e0cd25f21ef7bb5c53400a: Status 404 returned error can't find the container with id b9259e9c37a0c7a9a82fa94a2a2f12535859da9b46e0cd25f21ef7bb5c53400a Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.500582 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-khxxp"] Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.569492 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-memberlist\") pod \"speaker-hzvbb\" (UID: \"2516e125-5678-4a01-8a6b-1f8865b69f77\") " pod="metallb-system/speaker-hzvbb" Dec 16 15:10:14 crc kubenswrapper[4775]: E1216 15:10:14.569703 4775 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 16 15:10:14 crc kubenswrapper[4775]: E1216 15:10:14.569764 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-memberlist podName:2516e125-5678-4a01-8a6b-1f8865b69f77 nodeName:}" failed. No retries permitted until 2025-12-16 15:10:16.569743167 +0000 UTC m=+941.520822090 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-memberlist") pod "speaker-hzvbb" (UID: "2516e125-5678-4a01-8a6b-1f8865b69f77") : secret "metallb-memberlist" not found Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.712057 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sv5fs" Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.873526 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88rz7\" (UniqueName: \"kubernetes.io/projected/5f600eb7-4910-45dc-a250-4641387dc789-kube-api-access-88rz7\") pod \"5f600eb7-4910-45dc-a250-4641387dc789\" (UID: \"5f600eb7-4910-45dc-a250-4641387dc789\") " Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.874014 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f600eb7-4910-45dc-a250-4641387dc789-catalog-content\") pod \"5f600eb7-4910-45dc-a250-4641387dc789\" (UID: \"5f600eb7-4910-45dc-a250-4641387dc789\") " Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.874060 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f600eb7-4910-45dc-a250-4641387dc789-utilities\") pod \"5f600eb7-4910-45dc-a250-4641387dc789\" (UID: \"5f600eb7-4910-45dc-a250-4641387dc789\") " Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.875021 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f600eb7-4910-45dc-a250-4641387dc789-utilities" (OuterVolumeSpecName: "utilities") pod "5f600eb7-4910-45dc-a250-4641387dc789" (UID: "5f600eb7-4910-45dc-a250-4641387dc789"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.882186 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f600eb7-4910-45dc-a250-4641387dc789-kube-api-access-88rz7" (OuterVolumeSpecName: "kube-api-access-88rz7") pod "5f600eb7-4910-45dc-a250-4641387dc789" (UID: "5f600eb7-4910-45dc-a250-4641387dc789"). InnerVolumeSpecName "kube-api-access-88rz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.896424 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f600eb7-4910-45dc-a250-4641387dc789-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f600eb7-4910-45dc-a250-4641387dc789" (UID: "5f600eb7-4910-45dc-a250-4641387dc789"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.975574 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88rz7\" (UniqueName: \"kubernetes.io/projected/5f600eb7-4910-45dc-a250-4641387dc789-kube-api-access-88rz7\") on node \"crc\" DevicePath \"\"" Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.975614 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f600eb7-4910-45dc-a250-4641387dc789-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:10:14 crc kubenswrapper[4775]: I1216 15:10:14.975625 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f600eb7-4910-45dc-a250-4641387dc789-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:10:15 crc kubenswrapper[4775]: I1216 15:10:15.034476 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-khxxp" event={"ID":"aa78cbdc-f63e-4010-9bdc-88715f997591","Type":"ContainerStarted","Data":"274857b14ecc8e540e7152a12eda948b4b3a773c4478b673fd96ee1f4818a4ee"} Dec 16 15:10:15 crc kubenswrapper[4775]: I1216 15:10:15.034545 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-khxxp" event={"ID":"aa78cbdc-f63e-4010-9bdc-88715f997591","Type":"ContainerStarted","Data":"b9259e9c37a0c7a9a82fa94a2a2f12535859da9b46e0cd25f21ef7bb5c53400a"} Dec 16 15:10:15 crc kubenswrapper[4775]: I1216 
15:10:15.036052 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-6hfpf" event={"ID":"b55d23ff-e3e6-460c-8058-0489204c8a4d","Type":"ContainerStarted","Data":"ba88d561eca9871e555a6e9d6845a0b8db7faea3af6d0fbd0b2a0c6e1c0b4281"} Dec 16 15:10:15 crc kubenswrapper[4775]: I1216 15:10:15.039445 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sv5fs" event={"ID":"5f600eb7-4910-45dc-a250-4641387dc789","Type":"ContainerDied","Data":"c114ecae9eb91f7847286f2d1a3b130d1e063c29f0cc904792568973006e4740"} Dec 16 15:10:15 crc kubenswrapper[4775]: I1216 15:10:15.039513 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sv5fs" Dec 16 15:10:15 crc kubenswrapper[4775]: I1216 15:10:15.039533 4775 scope.go:117] "RemoveContainer" containerID="b3c47fcb8a707909d99c240dce49606ad1bfecfda30cab1c189e65555694f1da" Dec 16 15:10:15 crc kubenswrapper[4775]: I1216 15:10:15.042313 4775 generic.go:334] "Generic (PLEG): container finished" podID="69dd7d91-c613-4a03-ade3-805ce6879564" containerID="55792c476b9e1d02632fa9470960a8d4d363885363a57535055b1b45bf623b5f" exitCode=0 Dec 16 15:10:15 crc kubenswrapper[4775]: I1216 15:10:15.042362 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fttv2" event={"ID":"69dd7d91-c613-4a03-ade3-805ce6879564","Type":"ContainerDied","Data":"55792c476b9e1d02632fa9470960a8d4d363885363a57535055b1b45bf623b5f"} Dec 16 15:10:15 crc kubenswrapper[4775]: I1216 15:10:15.060626 4775 scope.go:117] "RemoveContainer" containerID="d08d951ac596a463d3bd07ec9df80a413dcf41be12ee84fd47d7deb9e76776ae" Dec 16 15:10:15 crc kubenswrapper[4775]: I1216 15:10:15.105621 4775 scope.go:117] "RemoveContainer" containerID="ff58939e5a023bba6b2e0d14c671bcbb59c4a1ee0da0f26993784d3320508a6e" Dec 16 15:10:15 crc kubenswrapper[4775]: I1216 15:10:15.120603 4775 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sv5fs"] Dec 16 15:10:15 crc kubenswrapper[4775]: I1216 15:10:15.125823 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sv5fs"] Dec 16 15:10:15 crc kubenswrapper[4775]: I1216 15:10:15.346899 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f600eb7-4910-45dc-a250-4641387dc789" path="/var/lib/kubelet/pods/5f600eb7-4910-45dc-a250-4641387dc789/volumes" Dec 16 15:10:16 crc kubenswrapper[4775]: I1216 15:10:16.051366 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-khxxp" event={"ID":"aa78cbdc-f63e-4010-9bdc-88715f997591","Type":"ContainerStarted","Data":"a862f63a5318960be694da66a3b1d6ee1d48d125b439b8be0acccdf0f4ae0323"} Dec 16 15:10:16 crc kubenswrapper[4775]: I1216 15:10:16.051847 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5bddd4b946-khxxp" Dec 16 15:10:16 crc kubenswrapper[4775]: I1216 15:10:16.062146 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fttv2" event={"ID":"69dd7d91-c613-4a03-ade3-805ce6879564","Type":"ContainerStarted","Data":"4c51919e7dfcf9747ee66ffdc9338d4244cd76fff4a7c3d7dafe65136d1184c0"} Dec 16 15:10:16 crc kubenswrapper[4775]: I1216 15:10:16.110526 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5bddd4b946-khxxp" podStartSLOduration=4.110505658 podStartE2EDuration="4.110505658s" podCreationTimestamp="2025-12-16 15:10:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:10:16.081395735 +0000 UTC m=+941.032474668" watchObservedRunningTime="2025-12-16 15:10:16.110505658 +0000 UTC m=+941.061584581" Dec 16 15:10:16 crc kubenswrapper[4775]: I1216 15:10:16.114180 4775 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fttv2" podStartSLOduration=2.4287507010000002 podStartE2EDuration="6.114159155s" podCreationTimestamp="2025-12-16 15:10:10 +0000 UTC" firstStartedPulling="2025-12-16 15:10:11.994789171 +0000 UTC m=+936.945868094" lastFinishedPulling="2025-12-16 15:10:15.680197625 +0000 UTC m=+940.631276548" observedRunningTime="2025-12-16 15:10:16.110013592 +0000 UTC m=+941.061092535" watchObservedRunningTime="2025-12-16 15:10:16.114159155 +0000 UTC m=+941.065238078" Dec 16 15:10:16 crc kubenswrapper[4775]: I1216 15:10:16.605485 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-memberlist\") pod \"speaker-hzvbb\" (UID: \"2516e125-5678-4a01-8a6b-1f8865b69f77\") " pod="metallb-system/speaker-hzvbb" Dec 16 15:10:16 crc kubenswrapper[4775]: I1216 15:10:16.610382 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2516e125-5678-4a01-8a6b-1f8865b69f77-memberlist\") pod \"speaker-hzvbb\" (UID: \"2516e125-5678-4a01-8a6b-1f8865b69f77\") " pod="metallb-system/speaker-hzvbb" Dec 16 15:10:16 crc kubenswrapper[4775]: I1216 15:10:16.829333 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-hzvbb" Dec 16 15:10:17 crc kubenswrapper[4775]: I1216 15:10:17.083484 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hzvbb" event={"ID":"2516e125-5678-4a01-8a6b-1f8865b69f77","Type":"ContainerStarted","Data":"09cb4cae5d2f890fbfcac64ca9e0cb8f44738348c51b9e765ef24a98439a0e50"} Dec 16 15:10:17 crc kubenswrapper[4775]: I1216 15:10:17.610914 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7f22b"] Dec 16 15:10:17 crc kubenswrapper[4775]: I1216 15:10:17.611205 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7f22b" podUID="49fef8e3-d561-4328-adbf-2a9f096f731a" containerName="registry-server" containerID="cri-o://22536a7522f3c44819f4fb422f51d926fa417b9712eab60f09378f1b103878f6" gracePeriod=2 Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.065343 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7f22b" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.114501 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hzvbb" event={"ID":"2516e125-5678-4a01-8a6b-1f8865b69f77","Type":"ContainerStarted","Data":"63e425ed30f85da8d1a3ccc137d163380a8284c0062e3e01eb17545dbc959a8d"} Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.114558 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hzvbb" event={"ID":"2516e125-5678-4a01-8a6b-1f8865b69f77","Type":"ContainerStarted","Data":"7b7e23bc4f0a7bd3beed21c7893333c94933619f33e9d59fc44edea6acba90dd"} Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.115538 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-hzvbb" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.129195 4775 generic.go:334] "Generic (PLEG): container finished" podID="49fef8e3-d561-4328-adbf-2a9f096f731a" containerID="22536a7522f3c44819f4fb422f51d926fa417b9712eab60f09378f1b103878f6" exitCode=0 Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.129248 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7f22b" event={"ID":"49fef8e3-d561-4328-adbf-2a9f096f731a","Type":"ContainerDied","Data":"22536a7522f3c44819f4fb422f51d926fa417b9712eab60f09378f1b103878f6"} Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.129278 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7f22b" event={"ID":"49fef8e3-d561-4328-adbf-2a9f096f731a","Type":"ContainerDied","Data":"33baa1bcee9f7474bdbab9e8c2efdfa608ffd1b83299f7948c3b425f614ec1c1"} Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.129297 4775 scope.go:117] "RemoveContainer" containerID="22536a7522f3c44819f4fb422f51d926fa417b9712eab60f09378f1b103878f6" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.129327 
4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7f22b" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.150712 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49fef8e3-d561-4328-adbf-2a9f096f731a-catalog-content\") pod \"49fef8e3-d561-4328-adbf-2a9f096f731a\" (UID: \"49fef8e3-d561-4328-adbf-2a9f096f731a\") " Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.150794 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74vvc\" (UniqueName: \"kubernetes.io/projected/49fef8e3-d561-4328-adbf-2a9f096f731a-kube-api-access-74vvc\") pod \"49fef8e3-d561-4328-adbf-2a9f096f731a\" (UID: \"49fef8e3-d561-4328-adbf-2a9f096f731a\") " Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.150857 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49fef8e3-d561-4328-adbf-2a9f096f731a-utilities\") pod \"49fef8e3-d561-4328-adbf-2a9f096f731a\" (UID: \"49fef8e3-d561-4328-adbf-2a9f096f731a\") " Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.152673 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49fef8e3-d561-4328-adbf-2a9f096f731a-utilities" (OuterVolumeSpecName: "utilities") pod "49fef8e3-d561-4328-adbf-2a9f096f731a" (UID: "49fef8e3-d561-4328-adbf-2a9f096f731a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.160264 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-hzvbb" podStartSLOduration=6.160241101 podStartE2EDuration="6.160241101s" podCreationTimestamp="2025-12-16 15:10:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:10:18.156541023 +0000 UTC m=+943.107619956" watchObservedRunningTime="2025-12-16 15:10:18.160241101 +0000 UTC m=+943.111320024" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.164658 4775 scope.go:117] "RemoveContainer" containerID="e14a9a9b582c35f73da083b678b308fc81a14ac0bf01db63ebd156f35de9fdf1" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.165023 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49fef8e3-d561-4328-adbf-2a9f096f731a-kube-api-access-74vvc" (OuterVolumeSpecName: "kube-api-access-74vvc") pod "49fef8e3-d561-4328-adbf-2a9f096f731a" (UID: "49fef8e3-d561-4328-adbf-2a9f096f731a"). InnerVolumeSpecName "kube-api-access-74vvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.222985 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49fef8e3-d561-4328-adbf-2a9f096f731a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49fef8e3-d561-4328-adbf-2a9f096f731a" (UID: "49fef8e3-d561-4328-adbf-2a9f096f731a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.242958 4775 scope.go:117] "RemoveContainer" containerID="2934c2ca886424b3732b3a92b0f3217839a4ad39d292c7854ba7e5ee39d8f1c7" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.253105 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49fef8e3-d561-4328-adbf-2a9f096f731a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.253145 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74vvc\" (UniqueName: \"kubernetes.io/projected/49fef8e3-d561-4328-adbf-2a9f096f731a-kube-api-access-74vvc\") on node \"crc\" DevicePath \"\"" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.253157 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49fef8e3-d561-4328-adbf-2a9f096f731a-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.267048 4775 scope.go:117] "RemoveContainer" containerID="22536a7522f3c44819f4fb422f51d926fa417b9712eab60f09378f1b103878f6" Dec 16 15:10:18 crc kubenswrapper[4775]: E1216 15:10:18.268708 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22536a7522f3c44819f4fb422f51d926fa417b9712eab60f09378f1b103878f6\": container with ID starting with 22536a7522f3c44819f4fb422f51d926fa417b9712eab60f09378f1b103878f6 not found: ID does not exist" containerID="22536a7522f3c44819f4fb422f51d926fa417b9712eab60f09378f1b103878f6" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.268760 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22536a7522f3c44819f4fb422f51d926fa417b9712eab60f09378f1b103878f6"} err="failed to get container status 
\"22536a7522f3c44819f4fb422f51d926fa417b9712eab60f09378f1b103878f6\": rpc error: code = NotFound desc = could not find container \"22536a7522f3c44819f4fb422f51d926fa417b9712eab60f09378f1b103878f6\": container with ID starting with 22536a7522f3c44819f4fb422f51d926fa417b9712eab60f09378f1b103878f6 not found: ID does not exist" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.268793 4775 scope.go:117] "RemoveContainer" containerID="e14a9a9b582c35f73da083b678b308fc81a14ac0bf01db63ebd156f35de9fdf1" Dec 16 15:10:18 crc kubenswrapper[4775]: E1216 15:10:18.269076 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e14a9a9b582c35f73da083b678b308fc81a14ac0bf01db63ebd156f35de9fdf1\": container with ID starting with e14a9a9b582c35f73da083b678b308fc81a14ac0bf01db63ebd156f35de9fdf1 not found: ID does not exist" containerID="e14a9a9b582c35f73da083b678b308fc81a14ac0bf01db63ebd156f35de9fdf1" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.269115 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e14a9a9b582c35f73da083b678b308fc81a14ac0bf01db63ebd156f35de9fdf1"} err="failed to get container status \"e14a9a9b582c35f73da083b678b308fc81a14ac0bf01db63ebd156f35de9fdf1\": rpc error: code = NotFound desc = could not find container \"e14a9a9b582c35f73da083b678b308fc81a14ac0bf01db63ebd156f35de9fdf1\": container with ID starting with e14a9a9b582c35f73da083b678b308fc81a14ac0bf01db63ebd156f35de9fdf1 not found: ID does not exist" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.269221 4775 scope.go:117] "RemoveContainer" containerID="2934c2ca886424b3732b3a92b0f3217839a4ad39d292c7854ba7e5ee39d8f1c7" Dec 16 15:10:18 crc kubenswrapper[4775]: E1216 15:10:18.269657 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2934c2ca886424b3732b3a92b0f3217839a4ad39d292c7854ba7e5ee39d8f1c7\": container with ID starting with 2934c2ca886424b3732b3a92b0f3217839a4ad39d292c7854ba7e5ee39d8f1c7 not found: ID does not exist" containerID="2934c2ca886424b3732b3a92b0f3217839a4ad39d292c7854ba7e5ee39d8f1c7" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.269693 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2934c2ca886424b3732b3a92b0f3217839a4ad39d292c7854ba7e5ee39d8f1c7"} err="failed to get container status \"2934c2ca886424b3732b3a92b0f3217839a4ad39d292c7854ba7e5ee39d8f1c7\": rpc error: code = NotFound desc = could not find container \"2934c2ca886424b3732b3a92b0f3217839a4ad39d292c7854ba7e5ee39d8f1c7\": container with ID starting with 2934c2ca886424b3732b3a92b0f3217839a4ad39d292c7854ba7e5ee39d8f1c7 not found: ID does not exist" Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.470293 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7f22b"] Dec 16 15:10:18 crc kubenswrapper[4775]: I1216 15:10:18.475698 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7f22b"] Dec 16 15:10:19 crc kubenswrapper[4775]: I1216 15:10:19.346861 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49fef8e3-d561-4328-adbf-2a9f096f731a" path="/var/lib/kubelet/pods/49fef8e3-d561-4328-adbf-2a9f096f731a/volumes" Dec 16 15:10:20 crc kubenswrapper[4775]: I1216 15:10:20.646326 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fttv2" Dec 16 15:10:20 crc kubenswrapper[4775]: I1216 15:10:20.646696 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fttv2" Dec 16 15:10:20 crc kubenswrapper[4775]: I1216 15:10:20.687782 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-fttv2" Dec 16 15:10:21 crc kubenswrapper[4775]: I1216 15:10:21.226800 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fttv2" Dec 16 15:10:22 crc kubenswrapper[4775]: I1216 15:10:22.012687 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fttv2"] Dec 16 15:10:23 crc kubenswrapper[4775]: I1216 15:10:23.176237 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-6hfpf" event={"ID":"b55d23ff-e3e6-460c-8058-0489204c8a4d","Type":"ContainerStarted","Data":"db9c692e685c382da87daf75da534b3b4e4430ac7192012c66316b919c437e22"} Dec 16 15:10:23 crc kubenswrapper[4775]: I1216 15:10:23.176311 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-6hfpf" Dec 16 15:10:23 crc kubenswrapper[4775]: I1216 15:10:23.178206 4775 generic.go:334] "Generic (PLEG): container finished" podID="d80b883c-02c1-4d56-a369-addb8c7bfdca" containerID="842a9ffffff5257e1e5bd4ab6377068d918c0a077b5a620a00513d2f438b6a33" exitCode=0 Dec 16 15:10:23 crc kubenswrapper[4775]: I1216 15:10:23.178261 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2txt9" event={"ID":"d80b883c-02c1-4d56-a369-addb8c7bfdca","Type":"ContainerDied","Data":"842a9ffffff5257e1e5bd4ab6377068d918c0a077b5a620a00513d2f438b6a33"} Dec 16 15:10:23 crc kubenswrapper[4775]: I1216 15:10:23.178395 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fttv2" podUID="69dd7d91-c613-4a03-ade3-805ce6879564" containerName="registry-server" containerID="cri-o://4c51919e7dfcf9747ee66ffdc9338d4244cd76fff4a7c3d7dafe65136d1184c0" gracePeriod=2 Dec 16 15:10:23 crc kubenswrapper[4775]: I1216 15:10:23.197216 4775 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-6hfpf" podStartSLOduration=2.583940452 podStartE2EDuration="11.197193676s" podCreationTimestamp="2025-12-16 15:10:12 +0000 UTC" firstStartedPulling="2025-12-16 15:10:14.244018574 +0000 UTC m=+939.195097497" lastFinishedPulling="2025-12-16 15:10:22.857271798 +0000 UTC m=+947.808350721" observedRunningTime="2025-12-16 15:10:23.191801863 +0000 UTC m=+948.142880786" watchObservedRunningTime="2025-12-16 15:10:23.197193676 +0000 UTC m=+948.148272599" Dec 16 15:10:23 crc kubenswrapper[4775]: I1216 15:10:23.554584 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fttv2" Dec 16 15:10:23 crc kubenswrapper[4775]: I1216 15:10:23.637073 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69dd7d91-c613-4a03-ade3-805ce6879564-catalog-content\") pod \"69dd7d91-c613-4a03-ade3-805ce6879564\" (UID: \"69dd7d91-c613-4a03-ade3-805ce6879564\") " Dec 16 15:10:23 crc kubenswrapper[4775]: I1216 15:10:23.637155 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6mfw\" (UniqueName: \"kubernetes.io/projected/69dd7d91-c613-4a03-ade3-805ce6879564-kube-api-access-k6mfw\") pod \"69dd7d91-c613-4a03-ade3-805ce6879564\" (UID: \"69dd7d91-c613-4a03-ade3-805ce6879564\") " Dec 16 15:10:23 crc kubenswrapper[4775]: I1216 15:10:23.637231 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69dd7d91-c613-4a03-ade3-805ce6879564-utilities\") pod \"69dd7d91-c613-4a03-ade3-805ce6879564\" (UID: \"69dd7d91-c613-4a03-ade3-805ce6879564\") " Dec 16 15:10:23 crc kubenswrapper[4775]: I1216 15:10:23.638250 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69dd7d91-c613-4a03-ade3-805ce6879564-utilities" 
(OuterVolumeSpecName: "utilities") pod "69dd7d91-c613-4a03-ade3-805ce6879564" (UID: "69dd7d91-c613-4a03-ade3-805ce6879564"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:10:23 crc kubenswrapper[4775]: I1216 15:10:23.645348 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69dd7d91-c613-4a03-ade3-805ce6879564-kube-api-access-k6mfw" (OuterVolumeSpecName: "kube-api-access-k6mfw") pod "69dd7d91-c613-4a03-ade3-805ce6879564" (UID: "69dd7d91-c613-4a03-ade3-805ce6879564"). InnerVolumeSpecName "kube-api-access-k6mfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:10:23 crc kubenswrapper[4775]: I1216 15:10:23.695242 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69dd7d91-c613-4a03-ade3-805ce6879564-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69dd7d91-c613-4a03-ade3-805ce6879564" (UID: "69dd7d91-c613-4a03-ade3-805ce6879564"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:10:23 crc kubenswrapper[4775]: I1216 15:10:23.738802 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6mfw\" (UniqueName: \"kubernetes.io/projected/69dd7d91-c613-4a03-ade3-805ce6879564-kube-api-access-k6mfw\") on node \"crc\" DevicePath \"\"" Dec 16 15:10:23 crc kubenswrapper[4775]: I1216 15:10:23.738834 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69dd7d91-c613-4a03-ade3-805ce6879564-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:10:23 crc kubenswrapper[4775]: I1216 15:10:23.738844 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69dd7d91-c613-4a03-ade3-805ce6879564-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:10:24 crc kubenswrapper[4775]: I1216 15:10:24.188512 4775 generic.go:334] "Generic (PLEG): container finished" podID="d80b883c-02c1-4d56-a369-addb8c7bfdca" containerID="d00895a9cb731522f7af7a2fc51709fd3e373afc7dac8632c245b15c1edf0532" exitCode=0 Dec 16 15:10:24 crc kubenswrapper[4775]: I1216 15:10:24.189059 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2txt9" event={"ID":"d80b883c-02c1-4d56-a369-addb8c7bfdca","Type":"ContainerDied","Data":"d00895a9cb731522f7af7a2fc51709fd3e373afc7dac8632c245b15c1edf0532"} Dec 16 15:10:24 crc kubenswrapper[4775]: I1216 15:10:24.192912 4775 generic.go:334] "Generic (PLEG): container finished" podID="69dd7d91-c613-4a03-ade3-805ce6879564" containerID="4c51919e7dfcf9747ee66ffdc9338d4244cd76fff4a7c3d7dafe65136d1184c0" exitCode=0 Dec 16 15:10:24 crc kubenswrapper[4775]: I1216 15:10:24.193491 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fttv2" Dec 16 15:10:24 crc kubenswrapper[4775]: I1216 15:10:24.196079 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fttv2" event={"ID":"69dd7d91-c613-4a03-ade3-805ce6879564","Type":"ContainerDied","Data":"4c51919e7dfcf9747ee66ffdc9338d4244cd76fff4a7c3d7dafe65136d1184c0"} Dec 16 15:10:24 crc kubenswrapper[4775]: I1216 15:10:24.196148 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fttv2" event={"ID":"69dd7d91-c613-4a03-ade3-805ce6879564","Type":"ContainerDied","Data":"7039d00a2584b4c2a56d5becededb757a2372173aa3c7ca8661041c5109e3fa8"} Dec 16 15:10:24 crc kubenswrapper[4775]: I1216 15:10:24.196189 4775 scope.go:117] "RemoveContainer" containerID="4c51919e7dfcf9747ee66ffdc9338d4244cd76fff4a7c3d7dafe65136d1184c0" Dec 16 15:10:24 crc kubenswrapper[4775]: I1216 15:10:24.224996 4775 scope.go:117] "RemoveContainer" containerID="55792c476b9e1d02632fa9470960a8d4d363885363a57535055b1b45bf623b5f" Dec 16 15:10:24 crc kubenswrapper[4775]: I1216 15:10:24.239059 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fttv2"] Dec 16 15:10:24 crc kubenswrapper[4775]: I1216 15:10:24.244190 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fttv2"] Dec 16 15:10:24 crc kubenswrapper[4775]: I1216 15:10:24.264651 4775 scope.go:117] "RemoveContainer" containerID="06620ea294ae4114bc4b41b08eb0489b0f61c87a21d574b314d8da42a8994a33" Dec 16 15:10:24 crc kubenswrapper[4775]: I1216 15:10:24.283817 4775 scope.go:117] "RemoveContainer" containerID="4c51919e7dfcf9747ee66ffdc9338d4244cd76fff4a7c3d7dafe65136d1184c0" Dec 16 15:10:24 crc kubenswrapper[4775]: E1216 15:10:24.284344 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4c51919e7dfcf9747ee66ffdc9338d4244cd76fff4a7c3d7dafe65136d1184c0\": container with ID starting with 4c51919e7dfcf9747ee66ffdc9338d4244cd76fff4a7c3d7dafe65136d1184c0 not found: ID does not exist" containerID="4c51919e7dfcf9747ee66ffdc9338d4244cd76fff4a7c3d7dafe65136d1184c0" Dec 16 15:10:24 crc kubenswrapper[4775]: I1216 15:10:24.284455 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c51919e7dfcf9747ee66ffdc9338d4244cd76fff4a7c3d7dafe65136d1184c0"} err="failed to get container status \"4c51919e7dfcf9747ee66ffdc9338d4244cd76fff4a7c3d7dafe65136d1184c0\": rpc error: code = NotFound desc = could not find container \"4c51919e7dfcf9747ee66ffdc9338d4244cd76fff4a7c3d7dafe65136d1184c0\": container with ID starting with 4c51919e7dfcf9747ee66ffdc9338d4244cd76fff4a7c3d7dafe65136d1184c0 not found: ID does not exist" Dec 16 15:10:24 crc kubenswrapper[4775]: I1216 15:10:24.284542 4775 scope.go:117] "RemoveContainer" containerID="55792c476b9e1d02632fa9470960a8d4d363885363a57535055b1b45bf623b5f" Dec 16 15:10:24 crc kubenswrapper[4775]: E1216 15:10:24.284824 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55792c476b9e1d02632fa9470960a8d4d363885363a57535055b1b45bf623b5f\": container with ID starting with 55792c476b9e1d02632fa9470960a8d4d363885363a57535055b1b45bf623b5f not found: ID does not exist" containerID="55792c476b9e1d02632fa9470960a8d4d363885363a57535055b1b45bf623b5f" Dec 16 15:10:24 crc kubenswrapper[4775]: I1216 15:10:24.284918 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55792c476b9e1d02632fa9470960a8d4d363885363a57535055b1b45bf623b5f"} err="failed to get container status \"55792c476b9e1d02632fa9470960a8d4d363885363a57535055b1b45bf623b5f\": rpc error: code = NotFound desc = could not find container \"55792c476b9e1d02632fa9470960a8d4d363885363a57535055b1b45bf623b5f\": container with ID 
starting with 55792c476b9e1d02632fa9470960a8d4d363885363a57535055b1b45bf623b5f not found: ID does not exist" Dec 16 15:10:24 crc kubenswrapper[4775]: I1216 15:10:24.284991 4775 scope.go:117] "RemoveContainer" containerID="06620ea294ae4114bc4b41b08eb0489b0f61c87a21d574b314d8da42a8994a33" Dec 16 15:10:24 crc kubenswrapper[4775]: E1216 15:10:24.285280 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06620ea294ae4114bc4b41b08eb0489b0f61c87a21d574b314d8da42a8994a33\": container with ID starting with 06620ea294ae4114bc4b41b08eb0489b0f61c87a21d574b314d8da42a8994a33 not found: ID does not exist" containerID="06620ea294ae4114bc4b41b08eb0489b0f61c87a21d574b314d8da42a8994a33" Dec 16 15:10:24 crc kubenswrapper[4775]: I1216 15:10:24.285363 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06620ea294ae4114bc4b41b08eb0489b0f61c87a21d574b314d8da42a8994a33"} err="failed to get container status \"06620ea294ae4114bc4b41b08eb0489b0f61c87a21d574b314d8da42a8994a33\": rpc error: code = NotFound desc = could not find container \"06620ea294ae4114bc4b41b08eb0489b0f61c87a21d574b314d8da42a8994a33\": container with ID starting with 06620ea294ae4114bc4b41b08eb0489b0f61c87a21d574b314d8da42a8994a33 not found: ID does not exist" Dec 16 15:10:25 crc kubenswrapper[4775]: I1216 15:10:25.200411 4775 generic.go:334] "Generic (PLEG): container finished" podID="d80b883c-02c1-4d56-a369-addb8c7bfdca" containerID="05bfc77508851ea1928997c931f363085a465d3e3eda90869d82b3be65228c9c" exitCode=0 Dec 16 15:10:25 crc kubenswrapper[4775]: I1216 15:10:25.200465 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2txt9" event={"ID":"d80b883c-02c1-4d56-a369-addb8c7bfdca","Type":"ContainerDied","Data":"05bfc77508851ea1928997c931f363085a465d3e3eda90869d82b3be65228c9c"} Dec 16 15:10:25 crc kubenswrapper[4775]: I1216 15:10:25.345567 4775 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="69dd7d91-c613-4a03-ade3-805ce6879564" path="/var/lib/kubelet/pods/69dd7d91-c613-4a03-ade3-805ce6879564/volumes" Dec 16 15:10:26 crc kubenswrapper[4775]: I1216 15:10:26.210158 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2txt9" event={"ID":"d80b883c-02c1-4d56-a369-addb8c7bfdca","Type":"ContainerStarted","Data":"6732c79dd1f15058b82558e962c5a1676c457c50d8a9bc41352bde8558fa358e"} Dec 16 15:10:26 crc kubenswrapper[4775]: I1216 15:10:26.210198 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2txt9" event={"ID":"d80b883c-02c1-4d56-a369-addb8c7bfdca","Type":"ContainerStarted","Data":"bc500f1fd11000a76e1f7c2e465a3af8d88f123b3f1ef749d37a9ed6e334f68a"} Dec 16 15:10:28 crc kubenswrapper[4775]: I1216 15:10:28.226210 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2txt9" event={"ID":"d80b883c-02c1-4d56-a369-addb8c7bfdca","Type":"ContainerStarted","Data":"109e0d2e822536d2a8c9911d17ab74b0a82be669736bb4f56d32e1c9bb1e5d81"} Dec 16 15:10:28 crc kubenswrapper[4775]: I1216 15:10:28.226849 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:28 crc kubenswrapper[4775]: I1216 15:10:28.226875 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2txt9" event={"ID":"d80b883c-02c1-4d56-a369-addb8c7bfdca","Type":"ContainerStarted","Data":"537a7b4f3e30189ed55b0b546989c8dd485e36d56048dc5ef18dbffb5c545db3"} Dec 16 15:10:28 crc kubenswrapper[4775]: I1216 15:10:28.226922 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2txt9" event={"ID":"d80b883c-02c1-4d56-a369-addb8c7bfdca","Type":"ContainerStarted","Data":"c03ddd9e9b6d617fadda5b2765d7d28ebd6da9420c25bb7dd0ce4ecfd16aa17f"} Dec 16 15:10:28 crc kubenswrapper[4775]: I1216 15:10:28.226943 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-2txt9" event={"ID":"d80b883c-02c1-4d56-a369-addb8c7bfdca","Type":"ContainerStarted","Data":"07c890da2dbd8be6a2dbcb7c5950f6fedca71ecb822e4b2e88b4bb486b994342"} Dec 16 15:10:28 crc kubenswrapper[4775]: I1216 15:10:28.256010 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-2txt9" podStartSLOduration=6.92852998 podStartE2EDuration="16.25597261s" podCreationTimestamp="2025-12-16 15:10:12 +0000 UTC" firstStartedPulling="2025-12-16 15:10:13.550525181 +0000 UTC m=+938.501604104" lastFinishedPulling="2025-12-16 15:10:22.877967801 +0000 UTC m=+947.829046734" observedRunningTime="2025-12-16 15:10:28.249241273 +0000 UTC m=+953.200320226" watchObservedRunningTime="2025-12-16 15:10:28.25597261 +0000 UTC m=+953.207051543" Dec 16 15:10:33 crc kubenswrapper[4775]: I1216 15:10:33.162027 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:33 crc kubenswrapper[4775]: I1216 15:10:33.230411 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:33 crc kubenswrapper[4775]: I1216 15:10:33.768356 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-6hfpf" Dec 16 15:10:33 crc kubenswrapper[4775]: I1216 15:10:33.841955 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5bddd4b946-khxxp" Dec 16 15:10:36 crc kubenswrapper[4775]: I1216 15:10:36.833790 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-hzvbb" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.533877 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wvn99"] Dec 16 15:10:39 crc kubenswrapper[4775]: E1216 15:10:39.534636 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="49fef8e3-d561-4328-adbf-2a9f096f731a" containerName="extract-utilities" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.534661 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fef8e3-d561-4328-adbf-2a9f096f731a" containerName="extract-utilities" Dec 16 15:10:39 crc kubenswrapper[4775]: E1216 15:10:39.534683 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f600eb7-4910-45dc-a250-4641387dc789" containerName="extract-utilities" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.534695 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f600eb7-4910-45dc-a250-4641387dc789" containerName="extract-utilities" Dec 16 15:10:39 crc kubenswrapper[4775]: E1216 15:10:39.534718 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f600eb7-4910-45dc-a250-4641387dc789" containerName="extract-content" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.534728 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f600eb7-4910-45dc-a250-4641387dc789" containerName="extract-content" Dec 16 15:10:39 crc kubenswrapper[4775]: E1216 15:10:39.534744 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69dd7d91-c613-4a03-ade3-805ce6879564" containerName="extract-utilities" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.534755 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="69dd7d91-c613-4a03-ade3-805ce6879564" containerName="extract-utilities" Dec 16 15:10:39 crc kubenswrapper[4775]: E1216 15:10:39.534783 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69dd7d91-c613-4a03-ade3-805ce6879564" containerName="registry-server" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.534794 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="69dd7d91-c613-4a03-ade3-805ce6879564" containerName="registry-server" Dec 16 15:10:39 crc kubenswrapper[4775]: E1216 15:10:39.534811 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="69dd7d91-c613-4a03-ade3-805ce6879564" containerName="extract-content" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.534822 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="69dd7d91-c613-4a03-ade3-805ce6879564" containerName="extract-content" Dec 16 15:10:39 crc kubenswrapper[4775]: E1216 15:10:39.534838 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f600eb7-4910-45dc-a250-4641387dc789" containerName="registry-server" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.534848 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f600eb7-4910-45dc-a250-4641387dc789" containerName="registry-server" Dec 16 15:10:39 crc kubenswrapper[4775]: E1216 15:10:39.534866 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fef8e3-d561-4328-adbf-2a9f096f731a" containerName="registry-server" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.534877 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fef8e3-d561-4328-adbf-2a9f096f731a" containerName="registry-server" Dec 16 15:10:39 crc kubenswrapper[4775]: E1216 15:10:39.534919 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fef8e3-d561-4328-adbf-2a9f096f731a" containerName="extract-content" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.534931 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fef8e3-d561-4328-adbf-2a9f096f731a" containerName="extract-content" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.535131 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="69dd7d91-c613-4a03-ade3-805ce6879564" containerName="registry-server" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.535156 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fef8e3-d561-4328-adbf-2a9f096f731a" containerName="registry-server" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.535175 4775 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5f600eb7-4910-45dc-a250-4641387dc789" containerName="registry-server" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.535869 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wvn99" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.537973 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-tlcv9" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.538542 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.538830 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.552359 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wvn99"] Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.659780 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnccd\" (UniqueName: \"kubernetes.io/projected/87ae0f39-1000-4f35-a953-7060b7286b71-kube-api-access-fnccd\") pod \"openstack-operator-index-wvn99\" (UID: \"87ae0f39-1000-4f35-a953-7060b7286b71\") " pod="openstack-operators/openstack-operator-index-wvn99" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.761287 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnccd\" (UniqueName: \"kubernetes.io/projected/87ae0f39-1000-4f35-a953-7060b7286b71-kube-api-access-fnccd\") pod \"openstack-operator-index-wvn99\" (UID: \"87ae0f39-1000-4f35-a953-7060b7286b71\") " pod="openstack-operators/openstack-operator-index-wvn99" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.786107 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fnccd\" (UniqueName: \"kubernetes.io/projected/87ae0f39-1000-4f35-a953-7060b7286b71-kube-api-access-fnccd\") pod \"openstack-operator-index-wvn99\" (UID: \"87ae0f39-1000-4f35-a953-7060b7286b71\") " pod="openstack-operators/openstack-operator-index-wvn99" Dec 16 15:10:39 crc kubenswrapper[4775]: I1216 15:10:39.861803 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wvn99" Dec 16 15:10:40 crc kubenswrapper[4775]: I1216 15:10:40.263413 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wvn99"] Dec 16 15:10:40 crc kubenswrapper[4775]: I1216 15:10:40.338697 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wvn99" event={"ID":"87ae0f39-1000-4f35-a953-7060b7286b71","Type":"ContainerStarted","Data":"df073a673cf76b824e3ee9da2ec221700aeff62cae02b65f79224ef5abfa0c97"} Dec 16 15:10:43 crc kubenswrapper[4775]: I1216 15:10:43.167738 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-2txt9" Dec 16 15:10:43 crc kubenswrapper[4775]: I1216 15:10:43.358569 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wvn99" event={"ID":"87ae0f39-1000-4f35-a953-7060b7286b71","Type":"ContainerStarted","Data":"08ae9501ac1fff3f419167d6edfc25dee5cdf552c83a476747e9b00b7d12b20e"} Dec 16 15:10:43 crc kubenswrapper[4775]: I1216 15:10:43.381170 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wvn99" podStartSLOduration=1.771003184 podStartE2EDuration="4.381147177s" podCreationTimestamp="2025-12-16 15:10:39 +0000 UTC" firstStartedPulling="2025-12-16 15:10:40.274100538 +0000 UTC m=+965.225179461" lastFinishedPulling="2025-12-16 15:10:42.884244531 +0000 UTC m=+967.835323454" observedRunningTime="2025-12-16 15:10:43.373368718 +0000 
UTC m=+968.324447661" watchObservedRunningTime="2025-12-16 15:10:43.381147177 +0000 UTC m=+968.332226100" Dec 16 15:10:43 crc kubenswrapper[4775]: I1216 15:10:43.714107 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wvn99"] Dec 16 15:10:44 crc kubenswrapper[4775]: I1216 15:10:44.528658 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nxcc2"] Dec 16 15:10:44 crc kubenswrapper[4775]: I1216 15:10:44.530058 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nxcc2" Dec 16 15:10:44 crc kubenswrapper[4775]: I1216 15:10:44.545625 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nxcc2"] Dec 16 15:10:44 crc kubenswrapper[4775]: I1216 15:10:44.635919 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs256\" (UniqueName: \"kubernetes.io/projected/16d64d82-cfc5-461a-a39a-48fd77562a54-kube-api-access-xs256\") pod \"openstack-operator-index-nxcc2\" (UID: \"16d64d82-cfc5-461a-a39a-48fd77562a54\") " pod="openstack-operators/openstack-operator-index-nxcc2" Dec 16 15:10:44 crc kubenswrapper[4775]: I1216 15:10:44.737508 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs256\" (UniqueName: \"kubernetes.io/projected/16d64d82-cfc5-461a-a39a-48fd77562a54-kube-api-access-xs256\") pod \"openstack-operator-index-nxcc2\" (UID: \"16d64d82-cfc5-461a-a39a-48fd77562a54\") " pod="openstack-operators/openstack-operator-index-nxcc2" Dec 16 15:10:44 crc kubenswrapper[4775]: I1216 15:10:44.770013 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs256\" (UniqueName: \"kubernetes.io/projected/16d64d82-cfc5-461a-a39a-48fd77562a54-kube-api-access-xs256\") pod \"openstack-operator-index-nxcc2\" (UID: 
\"16d64d82-cfc5-461a-a39a-48fd77562a54\") " pod="openstack-operators/openstack-operator-index-nxcc2" Dec 16 15:10:44 crc kubenswrapper[4775]: I1216 15:10:44.866808 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nxcc2" Dec 16 15:10:45 crc kubenswrapper[4775]: I1216 15:10:45.289681 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nxcc2"] Dec 16 15:10:45 crc kubenswrapper[4775]: W1216 15:10:45.301717 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16d64d82_cfc5_461a_a39a_48fd77562a54.slice/crio-d5cd5452e73b6f66c32d69c0cd0694798657de2119c47b5ea3de5e1c5e33c0cc WatchSource:0}: Error finding container d5cd5452e73b6f66c32d69c0cd0694798657de2119c47b5ea3de5e1c5e33c0cc: Status 404 returned error can't find the container with id d5cd5452e73b6f66c32d69c0cd0694798657de2119c47b5ea3de5e1c5e33c0cc Dec 16 15:10:45 crc kubenswrapper[4775]: I1216 15:10:45.374021 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nxcc2" event={"ID":"16d64d82-cfc5-461a-a39a-48fd77562a54","Type":"ContainerStarted","Data":"d5cd5452e73b6f66c32d69c0cd0694798657de2119c47b5ea3de5e1c5e33c0cc"} Dec 16 15:10:45 crc kubenswrapper[4775]: I1216 15:10:45.374192 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-wvn99" podUID="87ae0f39-1000-4f35-a953-7060b7286b71" containerName="registry-server" containerID="cri-o://08ae9501ac1fff3f419167d6edfc25dee5cdf552c83a476747e9b00b7d12b20e" gracePeriod=2 Dec 16 15:10:45 crc kubenswrapper[4775]: I1216 15:10:45.933810 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wvn99" Dec 16 15:10:46 crc kubenswrapper[4775]: I1216 15:10:46.057197 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnccd\" (UniqueName: \"kubernetes.io/projected/87ae0f39-1000-4f35-a953-7060b7286b71-kube-api-access-fnccd\") pod \"87ae0f39-1000-4f35-a953-7060b7286b71\" (UID: \"87ae0f39-1000-4f35-a953-7060b7286b71\") " Dec 16 15:10:46 crc kubenswrapper[4775]: I1216 15:10:46.064814 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87ae0f39-1000-4f35-a953-7060b7286b71-kube-api-access-fnccd" (OuterVolumeSpecName: "kube-api-access-fnccd") pod "87ae0f39-1000-4f35-a953-7060b7286b71" (UID: "87ae0f39-1000-4f35-a953-7060b7286b71"). InnerVolumeSpecName "kube-api-access-fnccd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:10:46 crc kubenswrapper[4775]: I1216 15:10:46.159871 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnccd\" (UniqueName: \"kubernetes.io/projected/87ae0f39-1000-4f35-a953-7060b7286b71-kube-api-access-fnccd\") on node \"crc\" DevicePath \"\"" Dec 16 15:10:46 crc kubenswrapper[4775]: I1216 15:10:46.386390 4775 generic.go:334] "Generic (PLEG): container finished" podID="87ae0f39-1000-4f35-a953-7060b7286b71" containerID="08ae9501ac1fff3f419167d6edfc25dee5cdf552c83a476747e9b00b7d12b20e" exitCode=0 Dec 16 15:10:46 crc kubenswrapper[4775]: I1216 15:10:46.386504 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wvn99" event={"ID":"87ae0f39-1000-4f35-a953-7060b7286b71","Type":"ContainerDied","Data":"08ae9501ac1fff3f419167d6edfc25dee5cdf552c83a476747e9b00b7d12b20e"} Dec 16 15:10:46 crc kubenswrapper[4775]: I1216 15:10:46.386550 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wvn99" 
event={"ID":"87ae0f39-1000-4f35-a953-7060b7286b71","Type":"ContainerDied","Data":"df073a673cf76b824e3ee9da2ec221700aeff62cae02b65f79224ef5abfa0c97"} Dec 16 15:10:46 crc kubenswrapper[4775]: I1216 15:10:46.386580 4775 scope.go:117] "RemoveContainer" containerID="08ae9501ac1fff3f419167d6edfc25dee5cdf552c83a476747e9b00b7d12b20e" Dec 16 15:10:46 crc kubenswrapper[4775]: I1216 15:10:46.386978 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wvn99" Dec 16 15:10:46 crc kubenswrapper[4775]: I1216 15:10:46.389614 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nxcc2" event={"ID":"16d64d82-cfc5-461a-a39a-48fd77562a54","Type":"ContainerStarted","Data":"69806aa6b4222beebab703676c8fa734c3c4adde743c584abb46281bf4ef07de"} Dec 16 15:10:46 crc kubenswrapper[4775]: I1216 15:10:46.417040 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nxcc2" podStartSLOduration=2.325047701 podStartE2EDuration="2.417012986s" podCreationTimestamp="2025-12-16 15:10:44 +0000 UTC" firstStartedPulling="2025-12-16 15:10:45.306739993 +0000 UTC m=+970.257818926" lastFinishedPulling="2025-12-16 15:10:45.398705298 +0000 UTC m=+970.349784211" observedRunningTime="2025-12-16 15:10:46.409188074 +0000 UTC m=+971.360267087" watchObservedRunningTime="2025-12-16 15:10:46.417012986 +0000 UTC m=+971.368091929" Dec 16 15:10:46 crc kubenswrapper[4775]: I1216 15:10:46.417335 4775 scope.go:117] "RemoveContainer" containerID="08ae9501ac1fff3f419167d6edfc25dee5cdf552c83a476747e9b00b7d12b20e" Dec 16 15:10:46 crc kubenswrapper[4775]: E1216 15:10:46.417936 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08ae9501ac1fff3f419167d6edfc25dee5cdf552c83a476747e9b00b7d12b20e\": container with ID starting with 
08ae9501ac1fff3f419167d6edfc25dee5cdf552c83a476747e9b00b7d12b20e not found: ID does not exist" containerID="08ae9501ac1fff3f419167d6edfc25dee5cdf552c83a476747e9b00b7d12b20e" Dec 16 15:10:46 crc kubenswrapper[4775]: I1216 15:10:46.417981 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ae9501ac1fff3f419167d6edfc25dee5cdf552c83a476747e9b00b7d12b20e"} err="failed to get container status \"08ae9501ac1fff3f419167d6edfc25dee5cdf552c83a476747e9b00b7d12b20e\": rpc error: code = NotFound desc = could not find container \"08ae9501ac1fff3f419167d6edfc25dee5cdf552c83a476747e9b00b7d12b20e\": container with ID starting with 08ae9501ac1fff3f419167d6edfc25dee5cdf552c83a476747e9b00b7d12b20e not found: ID does not exist" Dec 16 15:10:46 crc kubenswrapper[4775]: I1216 15:10:46.438017 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wvn99"] Dec 16 15:10:46 crc kubenswrapper[4775]: I1216 15:10:46.445550 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-wvn99"] Dec 16 15:10:47 crc kubenswrapper[4775]: I1216 15:10:47.345875 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87ae0f39-1000-4f35-a953-7060b7286b71" path="/var/lib/kubelet/pods/87ae0f39-1000-4f35-a953-7060b7286b71/volumes" Dec 16 15:10:54 crc kubenswrapper[4775]: I1216 15:10:54.867664 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-nxcc2" Dec 16 15:10:54 crc kubenswrapper[4775]: I1216 15:10:54.868285 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-nxcc2" Dec 16 15:10:54 crc kubenswrapper[4775]: I1216 15:10:54.897827 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-nxcc2" Dec 16 15:10:55 crc kubenswrapper[4775]: I1216 
15:10:55.487307 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-nxcc2" Dec 16 15:11:00 crc kubenswrapper[4775]: I1216 15:11:00.883379 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt"] Dec 16 15:11:00 crc kubenswrapper[4775]: E1216 15:11:00.884139 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87ae0f39-1000-4f35-a953-7060b7286b71" containerName="registry-server" Dec 16 15:11:00 crc kubenswrapper[4775]: I1216 15:11:00.884152 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ae0f39-1000-4f35-a953-7060b7286b71" containerName="registry-server" Dec 16 15:11:00 crc kubenswrapper[4775]: I1216 15:11:00.884269 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="87ae0f39-1000-4f35-a953-7060b7286b71" containerName="registry-server" Dec 16 15:11:00 crc kubenswrapper[4775]: I1216 15:11:00.885124 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" Dec 16 15:11:00 crc kubenswrapper[4775]: I1216 15:11:00.887648 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vzkg8" Dec 16 15:11:00 crc kubenswrapper[4775]: I1216 15:11:00.898761 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt"] Dec 16 15:11:00 crc kubenswrapper[4775]: I1216 15:11:00.996429 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16deb1c1-d3c3-46d3-b565-30ef1773f202-util\") pod \"cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt\" (UID: \"16deb1c1-d3c3-46d3-b565-30ef1773f202\") " pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" Dec 16 15:11:00 crc kubenswrapper[4775]: I1216 15:11:00.996520 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16deb1c1-d3c3-46d3-b565-30ef1773f202-bundle\") pod \"cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt\" (UID: \"16deb1c1-d3c3-46d3-b565-30ef1773f202\") " pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" Dec 16 15:11:00 crc kubenswrapper[4775]: I1216 15:11:00.996570 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cj42\" (UniqueName: \"kubernetes.io/projected/16deb1c1-d3c3-46d3-b565-30ef1773f202-kube-api-access-5cj42\") pod \"cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt\" (UID: \"16deb1c1-d3c3-46d3-b565-30ef1773f202\") " pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" Dec 16 15:11:01 crc kubenswrapper[4775]: I1216 
15:11:01.098239 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16deb1c1-d3c3-46d3-b565-30ef1773f202-util\") pod \"cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt\" (UID: \"16deb1c1-d3c3-46d3-b565-30ef1773f202\") " pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" Dec 16 15:11:01 crc kubenswrapper[4775]: I1216 15:11:01.098342 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16deb1c1-d3c3-46d3-b565-30ef1773f202-bundle\") pod \"cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt\" (UID: \"16deb1c1-d3c3-46d3-b565-30ef1773f202\") " pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" Dec 16 15:11:01 crc kubenswrapper[4775]: I1216 15:11:01.098429 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cj42\" (UniqueName: \"kubernetes.io/projected/16deb1c1-d3c3-46d3-b565-30ef1773f202-kube-api-access-5cj42\") pod \"cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt\" (UID: \"16deb1c1-d3c3-46d3-b565-30ef1773f202\") " pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" Dec 16 15:11:01 crc kubenswrapper[4775]: I1216 15:11:01.098945 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16deb1c1-d3c3-46d3-b565-30ef1773f202-util\") pod \"cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt\" (UID: \"16deb1c1-d3c3-46d3-b565-30ef1773f202\") " pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" Dec 16 15:11:01 crc kubenswrapper[4775]: I1216 15:11:01.099308 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/16deb1c1-d3c3-46d3-b565-30ef1773f202-bundle\") pod \"cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt\" (UID: \"16deb1c1-d3c3-46d3-b565-30ef1773f202\") " pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" Dec 16 15:11:01 crc kubenswrapper[4775]: I1216 15:11:01.122753 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cj42\" (UniqueName: \"kubernetes.io/projected/16deb1c1-d3c3-46d3-b565-30ef1773f202-kube-api-access-5cj42\") pod \"cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt\" (UID: \"16deb1c1-d3c3-46d3-b565-30ef1773f202\") " pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" Dec 16 15:11:01 crc kubenswrapper[4775]: I1216 15:11:01.213693 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" Dec 16 15:11:01 crc kubenswrapper[4775]: I1216 15:11:01.633721 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt"] Dec 16 15:11:02 crc kubenswrapper[4775]: I1216 15:11:02.493620 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" event={"ID":"16deb1c1-d3c3-46d3-b565-30ef1773f202","Type":"ContainerStarted","Data":"ddafe21fc111ef1dbef47c27ef5ee14eaee3e9aacca73c9423cafa26ec26fc8c"} Dec 16 15:11:02 crc kubenswrapper[4775]: I1216 15:11:02.868808 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:11:02 crc kubenswrapper[4775]: I1216 15:11:02.869195 4775 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:11:03 crc kubenswrapper[4775]: I1216 15:11:03.499688 4775 generic.go:334] "Generic (PLEG): container finished" podID="16deb1c1-d3c3-46d3-b565-30ef1773f202" containerID="29579be1e19c2b6f6051b9a4ebd194ac6930ccdceea99b7f16d8f04a697cc5b5" exitCode=0 Dec 16 15:11:03 crc kubenswrapper[4775]: I1216 15:11:03.499737 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" event={"ID":"16deb1c1-d3c3-46d3-b565-30ef1773f202","Type":"ContainerDied","Data":"29579be1e19c2b6f6051b9a4ebd194ac6930ccdceea99b7f16d8f04a697cc5b5"} Dec 16 15:11:04 crc kubenswrapper[4775]: I1216 15:11:04.508831 4775 generic.go:334] "Generic (PLEG): container finished" podID="16deb1c1-d3c3-46d3-b565-30ef1773f202" containerID="295cde378faacf895c716a2cc2367388a73028558f240499b5721d2f5830749c" exitCode=0 Dec 16 15:11:04 crc kubenswrapper[4775]: I1216 15:11:04.508920 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" event={"ID":"16deb1c1-d3c3-46d3-b565-30ef1773f202","Type":"ContainerDied","Data":"295cde378faacf895c716a2cc2367388a73028558f240499b5721d2f5830749c"} Dec 16 15:11:05 crc kubenswrapper[4775]: I1216 15:11:05.518699 4775 generic.go:334] "Generic (PLEG): container finished" podID="16deb1c1-d3c3-46d3-b565-30ef1773f202" containerID="58de4283d4917cf9e2944b43a79754e4be3cabe3111ab02c0c7dfcfb97dea742" exitCode=0 Dec 16 15:11:05 crc kubenswrapper[4775]: I1216 15:11:05.518747 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" 
event={"ID":"16deb1c1-d3c3-46d3-b565-30ef1773f202","Type":"ContainerDied","Data":"58de4283d4917cf9e2944b43a79754e4be3cabe3111ab02c0c7dfcfb97dea742"} Dec 16 15:11:06 crc kubenswrapper[4775]: I1216 15:11:06.824197 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" Dec 16 15:11:06 crc kubenswrapper[4775]: I1216 15:11:06.988833 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16deb1c1-d3c3-46d3-b565-30ef1773f202-bundle\") pod \"16deb1c1-d3c3-46d3-b565-30ef1773f202\" (UID: \"16deb1c1-d3c3-46d3-b565-30ef1773f202\") " Dec 16 15:11:06 crc kubenswrapper[4775]: I1216 15:11:06.989136 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cj42\" (UniqueName: \"kubernetes.io/projected/16deb1c1-d3c3-46d3-b565-30ef1773f202-kube-api-access-5cj42\") pod \"16deb1c1-d3c3-46d3-b565-30ef1773f202\" (UID: \"16deb1c1-d3c3-46d3-b565-30ef1773f202\") " Dec 16 15:11:06 crc kubenswrapper[4775]: I1216 15:11:06.989164 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16deb1c1-d3c3-46d3-b565-30ef1773f202-util\") pod \"16deb1c1-d3c3-46d3-b565-30ef1773f202\" (UID: \"16deb1c1-d3c3-46d3-b565-30ef1773f202\") " Dec 16 15:11:06 crc kubenswrapper[4775]: I1216 15:11:06.989858 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16deb1c1-d3c3-46d3-b565-30ef1773f202-bundle" (OuterVolumeSpecName: "bundle") pod "16deb1c1-d3c3-46d3-b565-30ef1773f202" (UID: "16deb1c1-d3c3-46d3-b565-30ef1773f202"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:11:06 crc kubenswrapper[4775]: I1216 15:11:06.995366 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16deb1c1-d3c3-46d3-b565-30ef1773f202-kube-api-access-5cj42" (OuterVolumeSpecName: "kube-api-access-5cj42") pod "16deb1c1-d3c3-46d3-b565-30ef1773f202" (UID: "16deb1c1-d3c3-46d3-b565-30ef1773f202"). InnerVolumeSpecName "kube-api-access-5cj42". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:11:07 crc kubenswrapper[4775]: I1216 15:11:07.011780 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16deb1c1-d3c3-46d3-b565-30ef1773f202-util" (OuterVolumeSpecName: "util") pod "16deb1c1-d3c3-46d3-b565-30ef1773f202" (UID: "16deb1c1-d3c3-46d3-b565-30ef1773f202"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:11:07 crc kubenswrapper[4775]: I1216 15:11:07.091136 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cj42\" (UniqueName: \"kubernetes.io/projected/16deb1c1-d3c3-46d3-b565-30ef1773f202-kube-api-access-5cj42\") on node \"crc\" DevicePath \"\"" Dec 16 15:11:07 crc kubenswrapper[4775]: I1216 15:11:07.091175 4775 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16deb1c1-d3c3-46d3-b565-30ef1773f202-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:11:07 crc kubenswrapper[4775]: I1216 15:11:07.091184 4775 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16deb1c1-d3c3-46d3-b565-30ef1773f202-util\") on node \"crc\" DevicePath \"\"" Dec 16 15:11:07 crc kubenswrapper[4775]: I1216 15:11:07.537953 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" 
event={"ID":"16deb1c1-d3c3-46d3-b565-30ef1773f202","Type":"ContainerDied","Data":"ddafe21fc111ef1dbef47c27ef5ee14eaee3e9aacca73c9423cafa26ec26fc8c"} Dec 16 15:11:07 crc kubenswrapper[4775]: I1216 15:11:07.538006 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt" Dec 16 15:11:07 crc kubenswrapper[4775]: I1216 15:11:07.538014 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddafe21fc111ef1dbef47c27ef5ee14eaee3e9aacca73c9423cafa26ec26fc8c" Dec 16 15:11:12 crc kubenswrapper[4775]: I1216 15:11:12.891004 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6d499bd55-lnqxb"] Dec 16 15:11:12 crc kubenswrapper[4775]: E1216 15:11:12.891742 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16deb1c1-d3c3-46d3-b565-30ef1773f202" containerName="util" Dec 16 15:11:12 crc kubenswrapper[4775]: I1216 15:11:12.891755 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="16deb1c1-d3c3-46d3-b565-30ef1773f202" containerName="util" Dec 16 15:11:12 crc kubenswrapper[4775]: E1216 15:11:12.891781 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16deb1c1-d3c3-46d3-b565-30ef1773f202" containerName="extract" Dec 16 15:11:12 crc kubenswrapper[4775]: I1216 15:11:12.891787 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="16deb1c1-d3c3-46d3-b565-30ef1773f202" containerName="extract" Dec 16 15:11:12 crc kubenswrapper[4775]: E1216 15:11:12.891800 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16deb1c1-d3c3-46d3-b565-30ef1773f202" containerName="pull" Dec 16 15:11:12 crc kubenswrapper[4775]: I1216 15:11:12.891807 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="16deb1c1-d3c3-46d3-b565-30ef1773f202" containerName="pull" Dec 16 15:11:12 crc kubenswrapper[4775]: I1216 15:11:12.891947 4775 
memory_manager.go:354] "RemoveStaleState removing state" podUID="16deb1c1-d3c3-46d3-b565-30ef1773f202" containerName="extract" Dec 16 15:11:12 crc kubenswrapper[4775]: I1216 15:11:12.892420 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6d499bd55-lnqxb" Dec 16 15:11:12 crc kubenswrapper[4775]: I1216 15:11:12.894733 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-t9bf5" Dec 16 15:11:12 crc kubenswrapper[4775]: I1216 15:11:12.926327 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6d499bd55-lnqxb"] Dec 16 15:11:12 crc kubenswrapper[4775]: I1216 15:11:12.972717 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df5rs\" (UniqueName: \"kubernetes.io/projected/480fe07d-8bd7-4879-bb80-ceb5f0baf2cb-kube-api-access-df5rs\") pod \"openstack-operator-controller-operator-6d499bd55-lnqxb\" (UID: \"480fe07d-8bd7-4879-bb80-ceb5f0baf2cb\") " pod="openstack-operators/openstack-operator-controller-operator-6d499bd55-lnqxb" Dec 16 15:11:13 crc kubenswrapper[4775]: I1216 15:11:13.073608 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df5rs\" (UniqueName: \"kubernetes.io/projected/480fe07d-8bd7-4879-bb80-ceb5f0baf2cb-kube-api-access-df5rs\") pod \"openstack-operator-controller-operator-6d499bd55-lnqxb\" (UID: \"480fe07d-8bd7-4879-bb80-ceb5f0baf2cb\") " pod="openstack-operators/openstack-operator-controller-operator-6d499bd55-lnqxb" Dec 16 15:11:13 crc kubenswrapper[4775]: I1216 15:11:13.092199 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df5rs\" (UniqueName: \"kubernetes.io/projected/480fe07d-8bd7-4879-bb80-ceb5f0baf2cb-kube-api-access-df5rs\") pod 
\"openstack-operator-controller-operator-6d499bd55-lnqxb\" (UID: \"480fe07d-8bd7-4879-bb80-ceb5f0baf2cb\") " pod="openstack-operators/openstack-operator-controller-operator-6d499bd55-lnqxb" Dec 16 15:11:13 crc kubenswrapper[4775]: I1216 15:11:13.208402 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6d499bd55-lnqxb" Dec 16 15:11:13 crc kubenswrapper[4775]: I1216 15:11:13.670902 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6d499bd55-lnqxb"] Dec 16 15:11:13 crc kubenswrapper[4775]: W1216 15:11:13.682551 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod480fe07d_8bd7_4879_bb80_ceb5f0baf2cb.slice/crio-919337f9722b1d30325678865fd925972df8268e5cef11a21046bf2aec882d60 WatchSource:0}: Error finding container 919337f9722b1d30325678865fd925972df8268e5cef11a21046bf2aec882d60: Status 404 returned error can't find the container with id 919337f9722b1d30325678865fd925972df8268e5cef11a21046bf2aec882d60 Dec 16 15:11:14 crc kubenswrapper[4775]: I1216 15:11:14.701641 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6d499bd55-lnqxb" event={"ID":"480fe07d-8bd7-4879-bb80-ceb5f0baf2cb","Type":"ContainerStarted","Data":"919337f9722b1d30325678865fd925972df8268e5cef11a21046bf2aec882d60"} Dec 16 15:11:20 crc kubenswrapper[4775]: I1216 15:11:20.862620 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6d499bd55-lnqxb" event={"ID":"480fe07d-8bd7-4879-bb80-ceb5f0baf2cb","Type":"ContainerStarted","Data":"1031aca9e792d341f40daf2dbc1e7734d7a0b2eb78403a874f67f448401568ef"} Dec 16 15:11:20 crc kubenswrapper[4775]: I1216 15:11:20.863107 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-operator-6d499bd55-lnqxb" Dec 16 15:11:20 crc kubenswrapper[4775]: I1216 15:11:20.889006 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6d499bd55-lnqxb" podStartSLOduration=1.952831191 podStartE2EDuration="8.888986821s" podCreationTimestamp="2025-12-16 15:11:12 +0000 UTC" firstStartedPulling="2025-12-16 15:11:13.685353834 +0000 UTC m=+998.636432757" lastFinishedPulling="2025-12-16 15:11:20.621509464 +0000 UTC m=+1005.572588387" observedRunningTime="2025-12-16 15:11:20.886084878 +0000 UTC m=+1005.837163811" watchObservedRunningTime="2025-12-16 15:11:20.888986821 +0000 UTC m=+1005.840065744" Dec 16 15:11:32 crc kubenswrapper[4775]: I1216 15:11:32.869436 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:11:32 crc kubenswrapper[4775]: I1216 15:11:32.870339 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:11:33 crc kubenswrapper[4775]: I1216 15:11:33.212497 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6d499bd55-lnqxb" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.473127 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-9m4t8"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.474683 4775 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-95949466-9m4t8" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.479329 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-rqb4j" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.480034 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-5gxtk"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.480758 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-5gxtk" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.482820 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-4dbkm" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.488313 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-9m4t8"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.493385 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-5gxtk"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.578268 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-dfdrn"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.579188 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-dfdrn" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.582438 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gmvt9" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.587182 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-dfdrn"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.595919 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-dqpmk"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.596862 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-dqpmk" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.600654 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-8t447" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.607762 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-dqpmk"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.612591 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5559d9665f-4hmbr"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.613496 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5559d9665f-4hmbr" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.615441 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-qtlxm" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.621285 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-pk5fg"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.629988 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-pk5fg" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.633497 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-w6t5c" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.640951 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5559d9665f-4hmbr"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.642286 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68hc7\" (UniqueName: \"kubernetes.io/projected/03a9286d-3fd3-4ec6-9a1d-fb8d613f401e-kube-api-access-68hc7\") pod \"barbican-operator-controller-manager-95949466-9m4t8\" (UID: \"03a9286d-3fd3-4ec6-9a1d-fb8d613f401e\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-9m4t8" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.642353 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99c2t\" (UniqueName: \"kubernetes.io/projected/e002ee65-47de-44d4-864e-531283c322f7-kube-api-access-99c2t\") pod \"cinder-operator-controller-manager-5f98b4754f-5gxtk\" (UID: \"e002ee65-47de-44d4-864e-531283c322f7\") " 
pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-5gxtk" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.652552 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-pk5fg"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.663801 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.665198 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.668512 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-fpzp4" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.668749 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.692166 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.701419 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-9b8tb"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.703011 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-9b8tb" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.712201 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-gw5nx" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.715333 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-9b8tb"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.736389 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-gcfkm"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.737333 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-gcfkm" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.740711 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-2l5rc" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.743732 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj5tp\" (UniqueName: \"kubernetes.io/projected/d0dab2aa-577b-4a9d-bcce-0530cbb3e4b6-kube-api-access-sj5tp\") pod \"designate-operator-controller-manager-66f8b87655-dfdrn\" (UID: \"d0dab2aa-577b-4a9d-bcce-0530cbb3e4b6\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-dfdrn" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.743851 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnv78\" (UniqueName: \"kubernetes.io/projected/a738c781-0876-490f-bf95-d7d77a6f2aff-kube-api-access-mnv78\") pod \"horizon-operator-controller-manager-6ccf486b9-pk5fg\" (UID: 
\"a738c781-0876-490f-bf95-d7d77a6f2aff\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-pk5fg" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.743879 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nvvr\" (UniqueName: \"kubernetes.io/projected/8e002a19-f4ca-4186-940c-321834e88e5e-kube-api-access-5nvvr\") pod \"glance-operator-controller-manager-767f9d7567-dqpmk\" (UID: \"8e002a19-f4ca-4186-940c-321834e88e5e\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-dqpmk" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.743937 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68hc7\" (UniqueName: \"kubernetes.io/projected/03a9286d-3fd3-4ec6-9a1d-fb8d613f401e-kube-api-access-68hc7\") pod \"barbican-operator-controller-manager-95949466-9m4t8\" (UID: \"03a9286d-3fd3-4ec6-9a1d-fb8d613f401e\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-9m4t8" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.743975 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv54p\" (UniqueName: \"kubernetes.io/projected/c5962fcc-3c3b-435a-b848-237af19ce258-kube-api-access-nv54p\") pod \"heat-operator-controller-manager-5559d9665f-4hmbr\" (UID: \"c5962fcc-3c3b-435a-b848-237af19ce258\") " pod="openstack-operators/heat-operator-controller-manager-5559d9665f-4hmbr" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.744014 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99c2t\" (UniqueName: \"kubernetes.io/projected/e002ee65-47de-44d4-864e-531283c322f7-kube-api-access-99c2t\") pod \"cinder-operator-controller-manager-5f98b4754f-5gxtk\" (UID: \"e002ee65-47de-44d4-864e-531283c322f7\") " pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-5gxtk" Dec 16 
15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.755219 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-lph76"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.756404 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-lph76" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.758444 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-ldx6x" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.764660 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-lph76"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.771505 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-gcfkm"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.777071 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-47s9s"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.784948 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-47s9s" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.786102 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99c2t\" (UniqueName: \"kubernetes.io/projected/e002ee65-47de-44d4-864e-531283c322f7-kube-api-access-99c2t\") pod \"cinder-operator-controller-manager-5f98b4754f-5gxtk\" (UID: \"e002ee65-47de-44d4-864e-531283c322f7\") " pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-5gxtk" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.794863 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-4cvqk" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.796958 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68hc7\" (UniqueName: \"kubernetes.io/projected/03a9286d-3fd3-4ec6-9a1d-fb8d613f401e-kube-api-access-68hc7\") pod \"barbican-operator-controller-manager-95949466-9m4t8\" (UID: \"03a9286d-3fd3-4ec6-9a1d-fb8d613f401e\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-9m4t8" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.803743 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-95949466-9m4t8" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.822382 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-5gxtk" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.844775 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79rm7\" (UniqueName: \"kubernetes.io/projected/d8873d69-8f0e-4816-b39e-bf8506282196-kube-api-access-79rm7\") pod \"ironic-operator-controller-manager-f458558d7-9b8tb\" (UID: \"d8873d69-8f0e-4816-b39e-bf8506282196\") " pod="openstack-operators/ironic-operator-controller-manager-f458558d7-9b8tb" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.853308 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl4ql\" (UniqueName: \"kubernetes.io/projected/be824423-7753-4920-8aa7-93d2904280fb-kube-api-access-rl4ql\") pod \"infra-operator-controller-manager-6558fdd56c-jc4nj\" (UID: \"be824423-7753-4920-8aa7-93d2904280fb\") " pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.853511 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert\") pod \"infra-operator-controller-manager-6558fdd56c-jc4nj\" (UID: \"be824423-7753-4920-8aa7-93d2904280fb\") " pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.853811 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2hxc\" (UniqueName: \"kubernetes.io/projected/eff249fe-7aa9-406b-a4f0-91d7891afc8b-kube-api-access-r2hxc\") pod \"keystone-operator-controller-manager-5c7cbf548f-lph76\" (UID: \"eff249fe-7aa9-406b-a4f0-91d7891afc8b\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-lph76" Dec 16 15:12:01 crc 
kubenswrapper[4775]: I1216 15:12:01.846967 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-nvb99"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.855029 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj5tp\" (UniqueName: \"kubernetes.io/projected/d0dab2aa-577b-4a9d-bcce-0530cbb3e4b6-kube-api-access-sj5tp\") pod \"designate-operator-controller-manager-66f8b87655-dfdrn\" (UID: \"d0dab2aa-577b-4a9d-bcce-0530cbb3e4b6\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-dfdrn" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.855224 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnv78\" (UniqueName: \"kubernetes.io/projected/a738c781-0876-490f-bf95-d7d77a6f2aff-kube-api-access-mnv78\") pod \"horizon-operator-controller-manager-6ccf486b9-pk5fg\" (UID: \"a738c781-0876-490f-bf95-d7d77a6f2aff\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-pk5fg" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.855257 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nvvr\" (UniqueName: \"kubernetes.io/projected/8e002a19-f4ca-4186-940c-321834e88e5e-kube-api-access-5nvvr\") pod \"glance-operator-controller-manager-767f9d7567-dqpmk\" (UID: \"8e002a19-f4ca-4186-940c-321834e88e5e\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-dqpmk" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.855295 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv54p\" (UniqueName: \"kubernetes.io/projected/c5962fcc-3c3b-435a-b848-237af19ce258-kube-api-access-nv54p\") pod \"heat-operator-controller-manager-5559d9665f-4hmbr\" (UID: \"c5962fcc-3c3b-435a-b848-237af19ce258\") " 
pod="openstack-operators/heat-operator-controller-manager-5559d9665f-4hmbr" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.855348 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq4w4\" (UniqueName: \"kubernetes.io/projected/f82f14a2-7460-4a06-978b-d22d9ad7d6bd-kube-api-access-kq4w4\") pod \"manila-operator-controller-manager-5fdd9786f7-gcfkm\" (UID: \"f82f14a2-7460-4a06-978b-d22d9ad7d6bd\") " pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-gcfkm" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.855573 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-nvb99" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.872403 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-qmqgx"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.873338 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-qmqgx" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.887572 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-wcjhq" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.888004 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-8r4dz" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.913293 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj5tp\" (UniqueName: \"kubernetes.io/projected/d0dab2aa-577b-4a9d-bcce-0530cbb3e4b6-kube-api-access-sj5tp\") pod \"designate-operator-controller-manager-66f8b87655-dfdrn\" (UID: \"d0dab2aa-577b-4a9d-bcce-0530cbb3e4b6\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-dfdrn" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.920141 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnv78\" (UniqueName: \"kubernetes.io/projected/a738c781-0876-490f-bf95-d7d77a6f2aff-kube-api-access-mnv78\") pod \"horizon-operator-controller-manager-6ccf486b9-pk5fg\" (UID: \"a738c781-0876-490f-bf95-d7d77a6f2aff\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-pk5fg" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.928554 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv54p\" (UniqueName: \"kubernetes.io/projected/c5962fcc-3c3b-435a-b848-237af19ce258-kube-api-access-nv54p\") pod \"heat-operator-controller-manager-5559d9665f-4hmbr\" (UID: \"c5962fcc-3c3b-435a-b848-237af19ce258\") " pod="openstack-operators/heat-operator-controller-manager-5559d9665f-4hmbr" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.934263 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-dfdrn" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.937052 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nvvr\" (UniqueName: \"kubernetes.io/projected/8e002a19-f4ca-4186-940c-321834e88e5e-kube-api-access-5nvvr\") pod \"glance-operator-controller-manager-767f9d7567-dqpmk\" (UID: \"8e002a19-f4ca-4186-940c-321834e88e5e\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-dqpmk" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.951853 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-47s9s"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.957656 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-nj5th"] Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.958443 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-nj5th" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.958780 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq4w4\" (UniqueName: \"kubernetes.io/projected/f82f14a2-7460-4a06-978b-d22d9ad7d6bd-kube-api-access-kq4w4\") pod \"manila-operator-controller-manager-5fdd9786f7-gcfkm\" (UID: \"f82f14a2-7460-4a06-978b-d22d9ad7d6bd\") " pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-gcfkm" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.958854 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79rm7\" (UniqueName: \"kubernetes.io/projected/d8873d69-8f0e-4816-b39e-bf8506282196-kube-api-access-79rm7\") pod \"ironic-operator-controller-manager-f458558d7-9b8tb\" (UID: \"d8873d69-8f0e-4816-b39e-bf8506282196\") " pod="openstack-operators/ironic-operator-controller-manager-f458558d7-9b8tb" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.958901 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl4ql\" (UniqueName: \"kubernetes.io/projected/be824423-7753-4920-8aa7-93d2904280fb-kube-api-access-rl4ql\") pod \"infra-operator-controller-manager-6558fdd56c-jc4nj\" (UID: \"be824423-7753-4920-8aa7-93d2904280fb\") " pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.958944 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert\") pod \"infra-operator-controller-manager-6558fdd56c-jc4nj\" (UID: \"be824423-7753-4920-8aa7-93d2904280fb\") " pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.958992 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mths\" (UniqueName: \"kubernetes.io/projected/4fbf17e0-d42f-463b-9f01-a39d842812ff-kube-api-access-8mths\") pod \"mariadb-operator-controller-manager-f76f4954c-47s9s\" (UID: \"4fbf17e0-d42f-463b-9f01-a39d842812ff\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-47s9s" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.959024 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2hxc\" (UniqueName: \"kubernetes.io/projected/eff249fe-7aa9-406b-a4f0-91d7891afc8b-kube-api-access-r2hxc\") pod \"keystone-operator-controller-manager-5c7cbf548f-lph76\" (UID: \"eff249fe-7aa9-406b-a4f0-91d7891afc8b\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-lph76" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.959063 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59kjv\" (UniqueName: \"kubernetes.io/projected/63c035e4-8ff2-49a4-94d9-57c65a71494b-kube-api-access-59kjv\") pod \"nova-operator-controller-manager-5fbbf8b6cc-nvb99\" (UID: \"63c035e4-8ff2-49a4-94d9-57c65a71494b\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-nvb99" Dec 16 15:12:01 crc kubenswrapper[4775]: E1216 15:12:01.959169 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 15:12:01 crc kubenswrapper[4775]: E1216 15:12:01.959217 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert podName:be824423-7753-4920-8aa7-93d2904280fb nodeName:}" failed. No retries permitted until 2025-12-16 15:12:02.459197964 +0000 UTC m=+1047.410276887 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert") pod "infra-operator-controller-manager-6558fdd56c-jc4nj" (UID: "be824423-7753-4920-8aa7-93d2904280fb") : secret "infra-operator-webhook-server-cert" not found Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.962065 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-w9lkw" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.965074 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-dqpmk" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.981635 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl4ql\" (UniqueName: \"kubernetes.io/projected/be824423-7753-4920-8aa7-93d2904280fb-kube-api-access-rl4ql\") pod \"infra-operator-controller-manager-6558fdd56c-jc4nj\" (UID: \"be824423-7753-4920-8aa7-93d2904280fb\") " pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.993677 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79rm7\" (UniqueName: \"kubernetes.io/projected/d8873d69-8f0e-4816-b39e-bf8506282196-kube-api-access-79rm7\") pod \"ironic-operator-controller-manager-f458558d7-9b8tb\" (UID: \"d8873d69-8f0e-4816-b39e-bf8506282196\") " pod="openstack-operators/ironic-operator-controller-manager-f458558d7-9b8tb" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.993955 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq4w4\" (UniqueName: \"kubernetes.io/projected/f82f14a2-7460-4a06-978b-d22d9ad7d6bd-kube-api-access-kq4w4\") pod \"manila-operator-controller-manager-5fdd9786f7-gcfkm\" (UID: \"f82f14a2-7460-4a06-978b-d22d9ad7d6bd\") " 
pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-gcfkm" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.994349 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2hxc\" (UniqueName: \"kubernetes.io/projected/eff249fe-7aa9-406b-a4f0-91d7891afc8b-kube-api-access-r2hxc\") pod \"keystone-operator-controller-manager-5c7cbf548f-lph76\" (UID: \"eff249fe-7aa9-406b-a4f0-91d7891afc8b\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-lph76" Dec 16 15:12:01 crc kubenswrapper[4775]: I1216 15:12:01.996536 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5559d9665f-4hmbr" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.004954 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-nvb99"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.015011 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-qmqgx"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.021664 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-pk5fg" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.029780 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-nj5th"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.051036 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.051767 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.051795 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-d9rg9"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.052433 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-d9rg9" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.054206 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.056553 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.057540 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-jrr99" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.057739 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-chvrl" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.082679 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59kjv\" (UniqueName: \"kubernetes.io/projected/63c035e4-8ff2-49a4-94d9-57c65a71494b-kube-api-access-59kjv\") pod \"nova-operator-controller-manager-5fbbf8b6cc-nvb99\" (UID: \"63c035e4-8ff2-49a4-94d9-57c65a71494b\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-nvb99" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.082742 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrlpl\" (UniqueName: \"kubernetes.io/projected/1723eb19-5ef2-43d0-a1f8-590e89eb5f87-kube-api-access-jrlpl\") pod \"neutron-operator-controller-manager-7cd87b778f-qmqgx\" (UID: \"1723eb19-5ef2-43d0-a1f8-590e89eb5f87\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-qmqgx" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.082846 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mths\" (UniqueName: \"kubernetes.io/projected/4fbf17e0-d42f-463b-9f01-a39d842812ff-kube-api-access-8mths\") pod 
\"mariadb-operator-controller-manager-f76f4954c-47s9s\" (UID: \"4fbf17e0-d42f-463b-9f01-a39d842812ff\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-47s9s" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.082870 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75r2d\" (UniqueName: \"kubernetes.io/projected/11012716-6e3c-4b17-97c7-16e723ad1092-kube-api-access-75r2d\") pod \"octavia-operator-controller-manager-68c649d9d-nj5th\" (UID: \"11012716-6e3c-4b17-97c7-16e723ad1092\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-nj5th" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.095128 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-9b8tb" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.174068 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-gcfkm" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.185362 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxwsr\" (UniqueName: \"kubernetes.io/projected/b70a54b3-3bc0-45e4-add9-d47b81371266-kube-api-access-bxwsr\") pod \"ovn-operator-controller-manager-bf6d4f946-d9rg9\" (UID: \"b70a54b3-3bc0-45e4-add9-d47b81371266\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-d9rg9" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.190760 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8sgd\" (UniqueName: \"kubernetes.io/projected/275767d8-4eed-4a90-8d43-348c607ee37e-kube-api-access-q8sgd\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb\" (UID: \"275767d8-4eed-4a90-8d43-348c607ee37e\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.225035 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75r2d\" (UniqueName: \"kubernetes.io/projected/11012716-6e3c-4b17-97c7-16e723ad1092-kube-api-access-75r2d\") pod \"octavia-operator-controller-manager-68c649d9d-nj5th\" (UID: \"11012716-6e3c-4b17-97c7-16e723ad1092\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-nj5th" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.225170 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb\" (UID: \"275767d8-4eed-4a90-8d43-348c607ee37e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.225207 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrlpl\" (UniqueName: \"kubernetes.io/projected/1723eb19-5ef2-43d0-a1f8-590e89eb5f87-kube-api-access-jrlpl\") pod \"neutron-operator-controller-manager-7cd87b778f-qmqgx\" (UID: \"1723eb19-5ef2-43d0-a1f8-590e89eb5f87\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-qmqgx" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.252955 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-7fmnw"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.252983 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59kjv\" (UniqueName: \"kubernetes.io/projected/63c035e4-8ff2-49a4-94d9-57c65a71494b-kube-api-access-59kjv\") pod \"nova-operator-controller-manager-5fbbf8b6cc-nvb99\" (UID: 
\"63c035e4-8ff2-49a4-94d9-57c65a71494b\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-nvb99" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.254733 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-d9rg9"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.256218 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-7fmnw" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.263071 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-6lddp" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.264154 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75r2d\" (UniqueName: \"kubernetes.io/projected/11012716-6e3c-4b17-97c7-16e723ad1092-kube-api-access-75r2d\") pod \"octavia-operator-controller-manager-68c649d9d-nj5th\" (UID: \"11012716-6e3c-4b17-97c7-16e723ad1092\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-nj5th" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.267201 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrlpl\" (UniqueName: \"kubernetes.io/projected/1723eb19-5ef2-43d0-a1f8-590e89eb5f87-kube-api-access-jrlpl\") pod \"neutron-operator-controller-manager-7cd87b778f-qmqgx\" (UID: \"1723eb19-5ef2-43d0-a1f8-590e89eb5f87\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-qmqgx" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.267770 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-lph76" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.278027 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-r8p6v"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.279538 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-r8p6v" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.281523 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mths\" (UniqueName: \"kubernetes.io/projected/4fbf17e0-d42f-463b-9f01-a39d842812ff-kube-api-access-8mths\") pod \"mariadb-operator-controller-manager-f76f4954c-47s9s\" (UID: \"4fbf17e0-d42f-463b-9f01-a39d842812ff\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-47s9s" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.300537 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-vhhws" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.300866 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-7fmnw"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.301943 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-47s9s" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.316088 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-r8p6v"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.316918 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-nvb99" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.327480 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb\" (UID: \"275767d8-4eed-4a90-8d43-348c607ee37e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.327585 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxwsr\" (UniqueName: \"kubernetes.io/projected/b70a54b3-3bc0-45e4-add9-d47b81371266-kube-api-access-bxwsr\") pod \"ovn-operator-controller-manager-bf6d4f946-d9rg9\" (UID: \"b70a54b3-3bc0-45e4-add9-d47b81371266\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-d9rg9" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.327665 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8sgd\" (UniqueName: \"kubernetes.io/projected/275767d8-4eed-4a90-8d43-348c607ee37e-kube-api-access-q8sgd\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb\" (UID: \"275767d8-4eed-4a90-8d43-348c607ee37e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.327865 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-d2kbz"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.329109 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-d2kbz" Dec 16 15:12:02 crc kubenswrapper[4775]: E1216 15:12:02.329842 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:12:02 crc kubenswrapper[4775]: E1216 15:12:02.330018 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert podName:275767d8-4eed-4a90-8d43-348c607ee37e nodeName:}" failed. No retries permitted until 2025-12-16 15:12:02.829971116 +0000 UTC m=+1047.781050039 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert") pod "openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" (UID: "275767d8-4eed-4a90-8d43-348c607ee37e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.330879 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-qmqgx" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.335229 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-d2kbz"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.335589 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-nj5th" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.352146 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.353793 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.365738 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.379703 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-bl86c"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.380844 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-bl86c" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.385089 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-bl86c"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.397071 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.398498 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.407118 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc"] Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.429816 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp2dq\" (UniqueName: \"kubernetes.io/projected/19d1c138-c230-44b2-972c-c557693054f5-kube-api-access-sp2dq\") pod \"placement-operator-controller-manager-8665b56d78-7fmnw\" (UID: \"19d1c138-c230-44b2-972c-c557693054f5\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-7fmnw" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.430060 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gscnd\" (UniqueName: \"kubernetes.io/projected/2f9a8b75-2e17-43ce-be88-dbc6f7ec0cb1-kube-api-access-gscnd\") pod \"telemetry-operator-controller-manager-97d456b9-d2kbz\" (UID: \"2f9a8b75-2e17-43ce-be88-dbc6f7ec0cb1\") " pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-d2kbz" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.430157 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sxqs\" (UniqueName: \"kubernetes.io/projected/85cc53cf-83a7-4810-b0fc-7317f9327c09-kube-api-access-4sxqs\") pod \"swift-operator-controller-manager-5c6df8f9-r8p6v\" (UID: \"85cc53cf-83a7-4810-b0fc-7317f9327c09\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-r8p6v" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.443190 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-thl4r" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 
15:12:02.455610 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-c4bg8" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.456258 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-wg5jk" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.456999 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxwsr\" (UniqueName: \"kubernetes.io/projected/b70a54b3-3bc0-45e4-add9-d47b81371266-kube-api-access-bxwsr\") pod \"ovn-operator-controller-manager-bf6d4f946-d9rg9\" (UID: \"b70a54b3-3bc0-45e4-add9-d47b81371266\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-d9rg9" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.457471 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.457617 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.457754 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rrqlx" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.528077 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8sgd\" (UniqueName: \"kubernetes.io/projected/275767d8-4eed-4a90-8d43-348c607ee37e-kube-api-access-q8sgd\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb\" (UID: \"275767d8-4eed-4a90-8d43-348c607ee37e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.536229 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sp2dq\" (UniqueName: \"kubernetes.io/projected/19d1c138-c230-44b2-972c-c557693054f5-kube-api-access-sp2dq\") pod \"placement-operator-controller-manager-8665b56d78-7fmnw\" (UID: \"19d1c138-c230-44b2-972c-c557693054f5\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-7fmnw" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.536289 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gscnd\" (UniqueName: \"kubernetes.io/projected/2f9a8b75-2e17-43ce-be88-dbc6f7ec0cb1-kube-api-access-gscnd\") pod \"telemetry-operator-controller-manager-97d456b9-d2kbz\" (UID: \"2f9a8b75-2e17-43ce-be88-dbc6f7ec0cb1\") " pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-d2kbz" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.536320 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlrdl\" (UniqueName: \"kubernetes.io/projected/bd6aff58-984e-4106-acb0-c689f6e31832-kube-api-access-zlrdl\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.536361 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmlpk\" (UniqueName: \"kubernetes.io/projected/f05c78d5-d86c-42de-9eee-e8d09204a0b4-kube-api-access-lmlpk\") pod \"test-operator-controller-manager-756ccf86c7-4mncx\" (UID: \"f05c78d5-d86c-42de-9eee-e8d09204a0b4\") " pod="openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.536393 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sxqs\" (UniqueName: 
\"kubernetes.io/projected/85cc53cf-83a7-4810-b0fc-7317f9327c09-kube-api-access-4sxqs\") pod \"swift-operator-controller-manager-5c6df8f9-r8p6v\" (UID: \"85cc53cf-83a7-4810-b0fc-7317f9327c09\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-r8p6v" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.536415 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.536443 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert\") pod \"infra-operator-controller-manager-6558fdd56c-jc4nj\" (UID: \"be824423-7753-4920-8aa7-93d2904280fb\") " pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.536471 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.536518 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqncv\" (UniqueName: \"kubernetes.io/projected/14102b10-a3ba-4f16-9928-4f41426a435f-kube-api-access-hqncv\") pod \"watcher-operator-controller-manager-55f78b7c4c-bl86c\" (UID: 
\"14102b10-a3ba-4f16-9928-4f41426a435f\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-bl86c" Dec 16 15:12:02 crc kubenswrapper[4775]: E1216 15:12:02.537198 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 15:12:02 crc kubenswrapper[4775]: E1216 15:12:02.537243 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert podName:be824423-7753-4920-8aa7-93d2904280fb nodeName:}" failed. No retries permitted until 2025-12-16 15:12:03.537225813 +0000 UTC m=+1048.488304736 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert") pod "infra-operator-controller-manager-6558fdd56c-jc4nj" (UID: "be824423-7753-4920-8aa7-93d2904280fb") : secret "infra-operator-webhook-server-cert" not found Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.684469 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlrdl\" (UniqueName: \"kubernetes.io/projected/bd6aff58-984e-4106-acb0-c689f6e31832-kube-api-access-zlrdl\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.684747 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmlpk\" (UniqueName: \"kubernetes.io/projected/f05c78d5-d86c-42de-9eee-e8d09204a0b4-kube-api-access-lmlpk\") pod \"test-operator-controller-manager-756ccf86c7-4mncx\" (UID: \"f05c78d5-d86c-42de-9eee-e8d09204a0b4\") " pod="openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.684769 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.684807 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:02 crc kubenswrapper[4775]: I1216 15:12:02.684844 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqncv\" (UniqueName: \"kubernetes.io/projected/14102b10-a3ba-4f16-9928-4f41426a435f-kube-api-access-hqncv\") pod \"watcher-operator-controller-manager-55f78b7c4c-bl86c\" (UID: \"14102b10-a3ba-4f16-9928-4f41426a435f\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-bl86c" Dec 16 15:12:02 crc kubenswrapper[4775]: E1216 15:12:02.694129 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 15:12:02 crc kubenswrapper[4775]: E1216 15:12:02.694198 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs podName:bd6aff58-984e-4106-acb0-c689f6e31832 nodeName:}" failed. No retries permitted until 2025-12-16 15:12:03.194178016 +0000 UTC m=+1048.145256939 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs") pod "openstack-operator-controller-manager-7bc9b98d8-rvdbc" (UID: "bd6aff58-984e-4106-acb0-c689f6e31832") : secret "webhook-server-cert" not found Dec 16 15:12:02 crc kubenswrapper[4775]: E1216 15:12:02.694389 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 15:12:02 crc kubenswrapper[4775]: E1216 15:12:02.694418 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs podName:bd6aff58-984e-4106-acb0-c689f6e31832 nodeName:}" failed. No retries permitted until 2025-12-16 15:12:03.194411043 +0000 UTC m=+1048.145489966 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs") pod "openstack-operator-controller-manager-7bc9b98d8-rvdbc" (UID: "bd6aff58-984e-4106-acb0-c689f6e31832") : secret "metrics-server-cert" not found Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.112987 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp2dq\" (UniqueName: \"kubernetes.io/projected/19d1c138-c230-44b2-972c-c557693054f5-kube-api-access-sp2dq\") pod \"placement-operator-controller-manager-8665b56d78-7fmnw\" (UID: \"19d1c138-c230-44b2-972c-c557693054f5\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-7fmnw" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.114822 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmlpk\" (UniqueName: \"kubernetes.io/projected/f05c78d5-d86c-42de-9eee-e8d09204a0b4-kube-api-access-lmlpk\") pod \"test-operator-controller-manager-756ccf86c7-4mncx\" (UID: \"f05c78d5-d86c-42de-9eee-e8d09204a0b4\") " 
pod="openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.115098 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb\" (UID: \"275767d8-4eed-4a90-8d43-348c607ee37e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" Dec 16 15:12:03 crc kubenswrapper[4775]: E1216 15:12:03.115317 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:12:03 crc kubenswrapper[4775]: E1216 15:12:03.115369 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert podName:275767d8-4eed-4a90-8d43-348c607ee37e nodeName:}" failed. No retries permitted until 2025-12-16 15:12:04.115351562 +0000 UTC m=+1049.066430485 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert") pod "openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" (UID: "275767d8-4eed-4a90-8d43-348c607ee37e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.115385 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-d9rg9" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.115711 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.135560 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sxqs\" (UniqueName: \"kubernetes.io/projected/85cc53cf-83a7-4810-b0fc-7317f9327c09-kube-api-access-4sxqs\") pod \"swift-operator-controller-manager-5c6df8f9-r8p6v\" (UID: \"85cc53cf-83a7-4810-b0fc-7317f9327c09\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-r8p6v" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.135821 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.135868 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.135928 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.136574 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e2fab779748b41d2d6bca28ee35caff1c948d4988b65a4308383bcd22a0a32a5"} pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.136630 
4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" containerID="cri-o://e2fab779748b41d2d6bca28ee35caff1c948d4988b65a4308383bcd22a0a32a5" gracePeriod=600 Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.143053 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xd48q"] Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.143962 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xd48q" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.144638 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gscnd\" (UniqueName: \"kubernetes.io/projected/2f9a8b75-2e17-43ce-be88-dbc6f7ec0cb1-kube-api-access-gscnd\") pod \"telemetry-operator-controller-manager-97d456b9-d2kbz\" (UID: \"2f9a8b75-2e17-43ce-be88-dbc6f7ec0cb1\") " pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-d2kbz" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.151978 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqncv\" (UniqueName: \"kubernetes.io/projected/14102b10-a3ba-4f16-9928-4f41426a435f-kube-api-access-hqncv\") pod \"watcher-operator-controller-manager-55f78b7c4c-bl86c\" (UID: \"14102b10-a3ba-4f16-9928-4f41426a435f\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-bl86c" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.152404 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-hrv2z" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.156201 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zlrdl\" (UniqueName: \"kubernetes.io/projected/bd6aff58-984e-4106-acb0-c689f6e31832-kube-api-access-zlrdl\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.166610 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xd48q"] Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.202530 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-7fmnw" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.218598 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.218650 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:03 crc kubenswrapper[4775]: E1216 15:12:03.218833 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 15:12:03 crc kubenswrapper[4775]: E1216 15:12:03.218843 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 
15:12:03 crc kubenswrapper[4775]: E1216 15:12:03.218910 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs podName:bd6aff58-984e-4106-acb0-c689f6e31832 nodeName:}" failed. No retries permitted until 2025-12-16 15:12:04.218873162 +0000 UTC m=+1049.169952075 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs") pod "openstack-operator-controller-manager-7bc9b98d8-rvdbc" (UID: "bd6aff58-984e-4106-acb0-c689f6e31832") : secret "metrics-server-cert" not found Dec 16 15:12:03 crc kubenswrapper[4775]: E1216 15:12:03.219014 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs podName:bd6aff58-984e-4106-acb0-c689f6e31832 nodeName:}" failed. No retries permitted until 2025-12-16 15:12:04.218966475 +0000 UTC m=+1049.170045468 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs") pod "openstack-operator-controller-manager-7bc9b98d8-rvdbc" (UID: "bd6aff58-984e-4106-acb0-c689f6e31832") : secret "webhook-server-cert" not found Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.358656 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp5ls\" (UniqueName: \"kubernetes.io/projected/d132ccba-b1e9-4f8c-8129-1087a1a672b9-kube-api-access-sp5ls\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xd48q\" (UID: \"d132ccba-b1e9-4f8c-8129-1087a1a672b9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xd48q" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.383137 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-bl86c" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.399415 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-r8p6v" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.435973 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-9m4t8"] Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.439779 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-d2kbz" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.477782 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-5gxtk"] Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.488910 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp5ls\" (UniqueName: \"kubernetes.io/projected/d132ccba-b1e9-4f8c-8129-1087a1a672b9-kube-api-access-sp5ls\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xd48q\" (UID: \"d132ccba-b1e9-4f8c-8129-1087a1a672b9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xd48q" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.545913 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp5ls\" (UniqueName: \"kubernetes.io/projected/d132ccba-b1e9-4f8c-8129-1087a1a672b9-kube-api-access-sp5ls\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xd48q\" (UID: \"d132ccba-b1e9-4f8c-8129-1087a1a672b9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xd48q" Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.594326 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert\") pod \"infra-operator-controller-manager-6558fdd56c-jc4nj\" (UID: \"be824423-7753-4920-8aa7-93d2904280fb\") " pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" Dec 16 15:12:03 crc kubenswrapper[4775]: E1216 15:12:03.595185 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 15:12:03 crc kubenswrapper[4775]: E1216 15:12:03.595328 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert podName:be824423-7753-4920-8aa7-93d2904280fb nodeName:}" failed. No retries permitted until 2025-12-16 15:12:05.595304594 +0000 UTC m=+1050.546383517 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert") pod "infra-operator-controller-manager-6558fdd56c-jc4nj" (UID: "be824423-7753-4920-8aa7-93d2904280fb") : secret "infra-operator-webhook-server-cert" not found Dec 16 15:12:03 crc kubenswrapper[4775]: I1216 15:12:03.605854 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xd48q" Dec 16 15:12:04 crc kubenswrapper[4775]: I1216 15:12:04.234225 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:04 crc kubenswrapper[4775]: I1216 15:12:04.234657 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:04 crc kubenswrapper[4775]: I1216 15:12:04.234714 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb\" (UID: \"275767d8-4eed-4a90-8d43-348c607ee37e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" Dec 16 15:12:04 crc kubenswrapper[4775]: E1216 15:12:04.234833 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:12:04 crc kubenswrapper[4775]: E1216 15:12:04.234904 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert podName:275767d8-4eed-4a90-8d43-348c607ee37e nodeName:}" failed. 
No retries permitted until 2025-12-16 15:12:06.234869554 +0000 UTC m=+1051.185948477 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert") pod "openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" (UID: "275767d8-4eed-4a90-8d43-348c607ee37e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:12:04 crc kubenswrapper[4775]: E1216 15:12:04.235353 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 15:12:04 crc kubenswrapper[4775]: E1216 15:12:04.235390 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs podName:bd6aff58-984e-4106-acb0-c689f6e31832 nodeName:}" failed. No retries permitted until 2025-12-16 15:12:06.235379731 +0000 UTC m=+1051.186458654 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs") pod "openstack-operator-controller-manager-7bc9b98d8-rvdbc" (UID: "bd6aff58-984e-4106-acb0-c689f6e31832") : secret "webhook-server-cert" not found Dec 16 15:12:04 crc kubenswrapper[4775]: E1216 15:12:04.235424 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 15:12:04 crc kubenswrapper[4775]: E1216 15:12:04.235443 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs podName:bd6aff58-984e-4106-acb0-c689f6e31832 nodeName:}" failed. No retries permitted until 2025-12-16 15:12:06.235437633 +0000 UTC m=+1051.186516556 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs") pod "openstack-operator-controller-manager-7bc9b98d8-rvdbc" (UID: "bd6aff58-984e-4106-acb0-c689f6e31832") : secret "metrics-server-cert" not found Dec 16 15:12:04 crc kubenswrapper[4775]: I1216 15:12:04.413767 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-5gxtk" event={"ID":"e002ee65-47de-44d4-864e-531283c322f7","Type":"ContainerStarted","Data":"436ced3aa6c968b28742c5308feb1715340442dd7d153d6c2bd0949e10be345d"} Dec 16 15:12:04 crc kubenswrapper[4775]: I1216 15:12:04.415355 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-95949466-9m4t8" event={"ID":"03a9286d-3fd3-4ec6-9a1d-fb8d613f401e","Type":"ContainerStarted","Data":"8fb8eaa446b314c6f22078ef1bf907dc621b2dca1e5cef11d53c3e31794cd621"} Dec 16 15:12:04 crc kubenswrapper[4775]: I1216 15:12:04.554371 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-dqpmk"] Dec 16 15:12:04 crc kubenswrapper[4775]: I1216 15:12:04.571400 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-dfdrn"] Dec 16 15:12:04 crc kubenswrapper[4775]: I1216 15:12:04.585410 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-9b8tb"] Dec 16 15:12:04 crc kubenswrapper[4775]: W1216 15:12:04.600787 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e002a19_f4ca_4186_940c_321834e88e5e.slice/crio-e766530524fd839b5297c21527ae7d891d3dc6e674bc7bfd46d4627cfe497d39 WatchSource:0}: Error finding container e766530524fd839b5297c21527ae7d891d3dc6e674bc7bfd46d4627cfe497d39: 
Status 404 returned error can't find the container with id e766530524fd839b5297c21527ae7d891d3dc6e674bc7bfd46d4627cfe497d39 Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.097953 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-gcfkm"] Dec 16 15:12:05 crc kubenswrapper[4775]: W1216 15:12:05.129600 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf82f14a2_7460_4a06_978b_d22d9ad7d6bd.slice/crio-0d432839863ad09c33c4be0ab9d714252d45e26990922686e3483e16cf0ad13a WatchSource:0}: Error finding container 0d432839863ad09c33c4be0ab9d714252d45e26990922686e3483e16cf0ad13a: Status 404 returned error can't find the container with id 0d432839863ad09c33c4be0ab9d714252d45e26990922686e3483e16cf0ad13a Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.130740 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-pk5fg"] Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.171837 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-d2kbz"] Dec 16 15:12:05 crc kubenswrapper[4775]: W1216 15:12:05.206311 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f9a8b75_2e17_43ce_be88_dbc6f7ec0cb1.slice/crio-1fd7a9a92015d1bb8b3636be342978e1ab076f96769886ea527fcdef3da60d82 WatchSource:0}: Error finding container 1fd7a9a92015d1bb8b3636be342978e1ab076f96769886ea527fcdef3da60d82: Status 404 returned error can't find the container with id 1fd7a9a92015d1bb8b3636be342978e1ab076f96769886ea527fcdef3da60d82 Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.253771 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-lph76"] Dec 16 
15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.434132 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-gcfkm" event={"ID":"f82f14a2-7460-4a06-978b-d22d9ad7d6bd","Type":"ContainerStarted","Data":"0d432839863ad09c33c4be0ab9d714252d45e26990922686e3483e16cf0ad13a"} Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.446400 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-dfdrn" event={"ID":"d0dab2aa-577b-4a9d-bcce-0530cbb3e4b6","Type":"ContainerStarted","Data":"1a4085b5a4660ca4dea788f5ac1acb7c247936b7db0f6b054ca7afc199c5cb3c"} Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.451922 4775 generic.go:334] "Generic (PLEG): container finished" podID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerID="e2fab779748b41d2d6bca28ee35caff1c948d4988b65a4308383bcd22a0a32a5" exitCode=0 Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.451991 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerDied","Data":"e2fab779748b41d2d6bca28ee35caff1c948d4988b65a4308383bcd22a0a32a5"} Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.452024 4775 scope.go:117] "RemoveContainer" containerID="790666d10a8413c7b1bed65625e744b82eacfed0c75d107b7bd78a845e4df70e" Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.454816 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-pk5fg" event={"ID":"a738c781-0876-490f-bf95-d7d77a6f2aff","Type":"ContainerStarted","Data":"5f8324bd785354a0450092d1e2b4c1bd75ad4229f479ce693b45b80c8532f094"} Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.463284 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-d2kbz" event={"ID":"2f9a8b75-2e17-43ce-be88-dbc6f7ec0cb1","Type":"ContainerStarted","Data":"1fd7a9a92015d1bb8b3636be342978e1ab076f96769886ea527fcdef3da60d82"} Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.486661 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-9b8tb" event={"ID":"d8873d69-8f0e-4816-b39e-bf8506282196","Type":"ContainerStarted","Data":"9f5a6917c8c69fa4f8cddecb27d8d7cb7098d4802086a8ab0c24d53f33614a65"} Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.504805 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-lph76" event={"ID":"eff249fe-7aa9-406b-a4f0-91d7891afc8b","Type":"ContainerStarted","Data":"ec4ffc5d40cd8979b2e875cd38c5a91de1336c1c39ec62b6285929952112ba14"} Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.507659 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-dqpmk" event={"ID":"8e002a19-f4ca-4186-940c-321834e88e5e","Type":"ContainerStarted","Data":"e766530524fd839b5297c21527ae7d891d3dc6e674bc7bfd46d4627cfe497d39"} Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.558344 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-d9rg9"] Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.570909 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-nj5th"] Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.583056 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-nvb99"] Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.583110 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-bl86c"] Dec 16 15:12:05 crc kubenswrapper[4775]: W1216 15:12:05.590538 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb70a54b3_3bc0_45e4_add9_d47b81371266.slice/crio-dd75fcea7aadd46520022a463e71b32dbb53cc80cfbfe5d28bd40c57400784f8 WatchSource:0}: Error finding container dd75fcea7aadd46520022a463e71b32dbb53cc80cfbfe5d28bd40c57400784f8: Status 404 returned error can't find the container with id dd75fcea7aadd46520022a463e71b32dbb53cc80cfbfe5d28bd40c57400784f8 Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.617897 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xd48q"] Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.642061 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5559d9665f-4hmbr"] Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.650412 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-7fmnw"] Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.666861 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert\") pod \"infra-operator-controller-manager-6558fdd56c-jc4nj\" (UID: \"be824423-7753-4920-8aa7-93d2904280fb\") " pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" Dec 16 15:12:05 crc kubenswrapper[4775]: E1216 15:12:05.667018 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 15:12:05 crc kubenswrapper[4775]: E1216 15:12:05.667094 4775 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert podName:be824423-7753-4920-8aa7-93d2904280fb nodeName:}" failed. No retries permitted until 2025-12-16 15:12:09.667072245 +0000 UTC m=+1054.618151168 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert") pod "infra-operator-controller-manager-6558fdd56c-jc4nj" (UID: "be824423-7753-4920-8aa7-93d2904280fb") : secret "infra-operator-webhook-server-cert" not found Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.856230 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-47s9s"] Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.881629 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-r8p6v"] Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.887079 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-qmqgx"] Dec 16 15:12:05 crc kubenswrapper[4775]: I1216 15:12:05.899753 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx"] Dec 16 15:12:05 crc kubenswrapper[4775]: E1216 15:12:05.933142 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jrlpl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-7cd87b778f-qmqgx_openstack-operators(1723eb19-5ef2-43d0-a1f8-590e89eb5f87): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 15:12:05 crc kubenswrapper[4775]: E1216 15:12:05.934357 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-qmqgx" podUID="1723eb19-5ef2-43d0-a1f8-590e89eb5f87" Dec 16 15:12:05 crc kubenswrapper[4775]: E1216 15:12:05.963392 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lmlpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-756ccf86c7-4mncx_openstack-operators(f05c78d5-d86c-42de-9eee-e8d09204a0b4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 15:12:05 crc kubenswrapper[4775]: E1216 15:12:05.964758 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx" podUID="f05c78d5-d86c-42de-9eee-e8d09204a0b4" Dec 16 15:12:06 crc kubenswrapper[4775]: I1216 15:12:06.285239 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:06 crc kubenswrapper[4775]: I1216 15:12:06.285307 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:06 crc kubenswrapper[4775]: I1216 15:12:06.285355 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb\" (UID: \"275767d8-4eed-4a90-8d43-348c607ee37e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" Dec 16 15:12:06 crc kubenswrapper[4775]: E1216 15:12:06.285410 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 15:12:06 crc kubenswrapper[4775]: E1216 15:12:06.285446 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:12:06 crc kubenswrapper[4775]: E1216 15:12:06.285487 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert podName:275767d8-4eed-4a90-8d43-348c607ee37e nodeName:}" failed. No retries permitted until 2025-12-16 15:12:10.285470516 +0000 UTC m=+1055.236549439 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert") pod "openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" (UID: "275767d8-4eed-4a90-8d43-348c607ee37e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:12:06 crc kubenswrapper[4775]: E1216 15:12:06.285502 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs podName:bd6aff58-984e-4106-acb0-c689f6e31832 nodeName:}" failed. No retries permitted until 2025-12-16 15:12:10.285496227 +0000 UTC m=+1055.236575150 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs") pod "openstack-operator-controller-manager-7bc9b98d8-rvdbc" (UID: "bd6aff58-984e-4106-acb0-c689f6e31832") : secret "webhook-server-cert" not found Dec 16 15:12:06 crc kubenswrapper[4775]: E1216 15:12:06.285539 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 15:12:06 crc kubenswrapper[4775]: E1216 15:12:06.285559 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs podName:bd6aff58-984e-4106-acb0-c689f6e31832 nodeName:}" failed. No retries permitted until 2025-12-16 15:12:10.285553809 +0000 UTC m=+1055.236632732 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs") pod "openstack-operator-controller-manager-7bc9b98d8-rvdbc" (UID: "bd6aff58-984e-4106-acb0-c689f6e31832") : secret "metrics-server-cert" not found Dec 16 15:12:06 crc kubenswrapper[4775]: I1216 15:12:06.540184 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"1a2e1ee11401b69f007c34bf8d81d1b271d0b2639b666040ec08a76eb20c628c"} Dec 16 15:12:06 crc kubenswrapper[4775]: I1216 15:12:06.565596 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-d9rg9" event={"ID":"b70a54b3-3bc0-45e4-add9-d47b81371266","Type":"ContainerStarted","Data":"dd75fcea7aadd46520022a463e71b32dbb53cc80cfbfe5d28bd40c57400784f8"} Dec 16 15:12:06 crc kubenswrapper[4775]: I1216 15:12:06.572898 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx" event={"ID":"f05c78d5-d86c-42de-9eee-e8d09204a0b4","Type":"ContainerStarted","Data":"b110b68f0706f1111e45a4928ab8e06252fc57b36d59025f96e9fb5bf50e8201"} Dec 16 15:12:06 crc kubenswrapper[4775]: E1216 15:12:06.576701 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx" podUID="f05c78d5-d86c-42de-9eee-e8d09204a0b4" Dec 16 15:12:06 crc kubenswrapper[4775]: I1216 15:12:06.577875 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5559d9665f-4hmbr" 
event={"ID":"c5962fcc-3c3b-435a-b848-237af19ce258","Type":"ContainerStarted","Data":"3a682a5d5cff23b44789234772d51887229ff1ce2a01d421c70559a6490222f8"} Dec 16 15:12:06 crc kubenswrapper[4775]: I1216 15:12:06.582126 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-bl86c" event={"ID":"14102b10-a3ba-4f16-9928-4f41426a435f","Type":"ContainerStarted","Data":"ab3fc2b4615702708ca26b71716f8be0fa5994d4e0abdde9157ee8881a6b7337"} Dec 16 15:12:06 crc kubenswrapper[4775]: I1216 15:12:06.584045 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-nvb99" event={"ID":"63c035e4-8ff2-49a4-94d9-57c65a71494b","Type":"ContainerStarted","Data":"2d51f232452e5fa4e884dc63f8bbbc4f8218cca2eedbe6d23616022fa30607fe"} Dec 16 15:12:06 crc kubenswrapper[4775]: I1216 15:12:06.585801 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-7fmnw" event={"ID":"19d1c138-c230-44b2-972c-c557693054f5","Type":"ContainerStarted","Data":"34dcb9ac9cc3e2a376fea79c94b9bcbb90c935c583b3e05bf21882b8fd0fbe54"} Dec 16 15:12:06 crc kubenswrapper[4775]: I1216 15:12:06.587829 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-qmqgx" event={"ID":"1723eb19-5ef2-43d0-a1f8-590e89eb5f87","Type":"ContainerStarted","Data":"9cbe25e552a9e0f88c1fe883780f9be2d36eaa0aa8659ca1b52b3f9119e1d056"} Dec 16 15:12:06 crc kubenswrapper[4775]: E1216 15:12:06.600450 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-qmqgx" 
podUID="1723eb19-5ef2-43d0-a1f8-590e89eb5f87" Dec 16 15:12:06 crc kubenswrapper[4775]: I1216 15:12:06.636622 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xd48q" event={"ID":"d132ccba-b1e9-4f8c-8129-1087a1a672b9","Type":"ContainerStarted","Data":"ed00cbe7e7450bdfe995d677567b2cf34f12771c12377210a0a7c7a8c7cb7865"} Dec 16 15:12:06 crc kubenswrapper[4775]: I1216 15:12:06.639447 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-47s9s" event={"ID":"4fbf17e0-d42f-463b-9f01-a39d842812ff","Type":"ContainerStarted","Data":"c448f9e43ee082d90372ca8dbecb12c05074826d3ea0a86fff5934d11bba1346"} Dec 16 15:12:06 crc kubenswrapper[4775]: I1216 15:12:06.644170 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-r8p6v" event={"ID":"85cc53cf-83a7-4810-b0fc-7317f9327c09","Type":"ContainerStarted","Data":"da5e42406132fb4dfcd9bcc8ed7046b22acc930f9aab8406444d108824f85df4"} Dec 16 15:12:06 crc kubenswrapper[4775]: I1216 15:12:06.646212 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-nj5th" event={"ID":"11012716-6e3c-4b17-97c7-16e723ad1092","Type":"ContainerStarted","Data":"df398329dc26a5978e8d284524034ce1c853cd22e565b1089149118b55bf46b9"} Dec 16 15:12:07 crc kubenswrapper[4775]: E1216 15:12:07.672643 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-qmqgx" podUID="1723eb19-5ef2-43d0-a1f8-590e89eb5f87" Dec 16 15:12:07 crc kubenswrapper[4775]: E1216 15:12:07.675189 4775 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx" podUID="f05c78d5-d86c-42de-9eee-e8d09204a0b4" Dec 16 15:12:09 crc kubenswrapper[4775]: I1216 15:12:09.694594 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert\") pod \"infra-operator-controller-manager-6558fdd56c-jc4nj\" (UID: \"be824423-7753-4920-8aa7-93d2904280fb\") " pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" Dec 16 15:12:09 crc kubenswrapper[4775]: E1216 15:12:09.694744 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 15:12:09 crc kubenswrapper[4775]: E1216 15:12:09.695148 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert podName:be824423-7753-4920-8aa7-93d2904280fb nodeName:}" failed. No retries permitted until 2025-12-16 15:12:17.695109242 +0000 UTC m=+1062.646188165 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert") pod "infra-operator-controller-manager-6558fdd56c-jc4nj" (UID: "be824423-7753-4920-8aa7-93d2904280fb") : secret "infra-operator-webhook-server-cert" not found Dec 16 15:12:10 crc kubenswrapper[4775]: I1216 15:12:10.313174 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb\" (UID: \"275767d8-4eed-4a90-8d43-348c607ee37e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" Dec 16 15:12:10 crc kubenswrapper[4775]: I1216 15:12:10.313587 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:10 crc kubenswrapper[4775]: I1216 15:12:10.313629 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:10 crc kubenswrapper[4775]: E1216 15:12:10.313403 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:12:10 crc kubenswrapper[4775]: E1216 15:12:10.313775 4775 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert podName:275767d8-4eed-4a90-8d43-348c607ee37e nodeName:}" failed. No retries permitted until 2025-12-16 15:12:18.313757692 +0000 UTC m=+1063.264836615 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert") pod "openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" (UID: "275767d8-4eed-4a90-8d43-348c607ee37e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:12:10 crc kubenswrapper[4775]: E1216 15:12:10.313822 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 15:12:10 crc kubenswrapper[4775]: E1216 15:12:10.313844 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs podName:bd6aff58-984e-4106-acb0-c689f6e31832 nodeName:}" failed. No retries permitted until 2025-12-16 15:12:18.313837784 +0000 UTC m=+1063.264916707 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs") pod "openstack-operator-controller-manager-7bc9b98d8-rvdbc" (UID: "bd6aff58-984e-4106-acb0-c689f6e31832") : secret "webhook-server-cert" not found Dec 16 15:12:10 crc kubenswrapper[4775]: E1216 15:12:10.313723 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 15:12:10 crc kubenswrapper[4775]: E1216 15:12:10.313865 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs podName:bd6aff58-984e-4106-acb0-c689f6e31832 nodeName:}" failed. No retries permitted until 2025-12-16 15:12:18.313860375 +0000 UTC m=+1063.264939298 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs") pod "openstack-operator-controller-manager-7bc9b98d8-rvdbc" (UID: "bd6aff58-984e-4106-acb0-c689f6e31832") : secret "metrics-server-cert" not found Dec 16 15:12:17 crc kubenswrapper[4775]: I1216 15:12:17.703674 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert\") pod \"infra-operator-controller-manager-6558fdd56c-jc4nj\" (UID: \"be824423-7753-4920-8aa7-93d2904280fb\") " pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" Dec 16 15:12:17 crc kubenswrapper[4775]: I1216 15:12:17.716839 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be824423-7753-4920-8aa7-93d2904280fb-cert\") pod \"infra-operator-controller-manager-6558fdd56c-jc4nj\" (UID: \"be824423-7753-4920-8aa7-93d2904280fb\") " pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" Dec 16 15:12:17 crc kubenswrapper[4775]: I1216 15:12:17.949311 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" Dec 16 15:12:18 crc kubenswrapper[4775]: I1216 15:12:18.414712 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:18 crc kubenswrapper[4775]: I1216 15:12:18.415607 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb\" (UID: \"275767d8-4eed-4a90-8d43-348c607ee37e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" Dec 16 15:12:18 crc kubenswrapper[4775]: I1216 15:12:18.415948 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:18 crc kubenswrapper[4775]: E1216 15:12:18.416087 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 15:12:18 crc kubenswrapper[4775]: E1216 15:12:18.416155 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs podName:bd6aff58-984e-4106-acb0-c689f6e31832 nodeName:}" failed. No retries permitted until 2025-12-16 15:12:34.416136562 +0000 UTC m=+1079.367215475 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs") pod "openstack-operator-controller-manager-7bc9b98d8-rvdbc" (UID: "bd6aff58-984e-4106-acb0-c689f6e31832") : secret "webhook-server-cert" not found Dec 16 15:12:18 crc kubenswrapper[4775]: I1216 15:12:18.423658 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-metrics-certs\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:18 crc kubenswrapper[4775]: I1216 15:12:18.424720 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/275767d8-4eed-4a90-8d43-348c607ee37e-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb\" (UID: \"275767d8-4eed-4a90-8d43-348c607ee37e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" Dec 16 15:12:18 crc kubenswrapper[4775]: I1216 15:12:18.648153 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" Dec 16 15:12:19 crc kubenswrapper[4775]: E1216 15:12:19.997039 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:5639a8e1bbc8006cf0797de49b4c063c3531972e476c2257889bb66dac7fad8a" Dec 16 15:12:19 crc kubenswrapper[4775]: E1216 15:12:19.997675 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:5639a8e1bbc8006cf0797de49b4c063c3531972e476c2257889bb66dac7fad8a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-99c2t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 
8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-5f98b4754f-5gxtk_openstack-operators(e002ee65-47de-44d4-864e-531283c322f7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:19 crc kubenswrapper[4775]: E1216 15:12:19.999036 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-5gxtk" podUID="e002ee65-47de-44d4-864e-531283c322f7" Dec 16 15:12:20 crc kubenswrapper[4775]: E1216 15:12:20.872053 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:5639a8e1bbc8006cf0797de49b4c063c3531972e476c2257889bb66dac7fad8a\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-5gxtk" podUID="e002ee65-47de-44d4-864e-531283c322f7" Dec 16 15:12:22 crc kubenswrapper[4775]: E1216 15:12:22.310845 4775 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027" Dec 16 15:12:22 crc kubenswrapper[4775]: E1216 15:12:22.311434 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5nvvr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-767f9d7567-dqpmk_openstack-operators(8e002a19-f4ca-4186-940c-321834e88e5e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:22 crc kubenswrapper[4775]: E1216 15:12:22.312630 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-dqpmk" podUID="8e002a19-f4ca-4186-940c-321834e88e5e" Dec 16 15:12:22 crc kubenswrapper[4775]: E1216 15:12:22.884139 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027\\\"\"" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-dqpmk" podUID="8e002a19-f4ca-4186-940c-321834e88e5e" Dec 16 15:12:23 crc kubenswrapper[4775]: E1216 15:12:23.214631 4775 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 16 15:12:23 crc kubenswrapper[4775]: E1216 15:12:23.215047 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mnv78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-6ccf486b9-pk5fg_openstack-operators(a738c781-0876-490f-bf95-d7d77a6f2aff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:23 crc kubenswrapper[4775]: E1216 15:12:23.216239 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-pk5fg" podUID="a738c781-0876-490f-bf95-d7d77a6f2aff" Dec 16 15:12:23 crc kubenswrapper[4775]: E1216 15:12:23.928155 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-pk5fg" podUID="a738c781-0876-490f-bf95-d7d77a6f2aff" Dec 16 15:12:24 crc kubenswrapper[4775]: E1216 15:12:24.277663 4775 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a" Dec 16 15:12:24 crc kubenswrapper[4775]: E1216 15:12:24.277868 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sj5tp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-66f8b87655-dfdrn_openstack-operators(d0dab2aa-577b-4a9d-bcce-0530cbb3e4b6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:24 crc kubenswrapper[4775]: E1216 15:12:24.279291 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-dfdrn" podUID="d0dab2aa-577b-4a9d-bcce-0530cbb3e4b6" Dec 16 15:12:24 crc kubenswrapper[4775]: E1216 15:12:24.936053 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a\\\"\"" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-dfdrn" podUID="d0dab2aa-577b-4a9d-bcce-0530cbb3e4b6" Dec 16 15:12:25 crc kubenswrapper[4775]: E1216 15:12:25.166493 4775 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 16 15:12:25 crc kubenswrapper[4775]: E1216 15:12:25.166761 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bxwsr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bf6d4f946-d9rg9_openstack-operators(b70a54b3-3bc0-45e4-add9-d47b81371266): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:25 crc kubenswrapper[4775]: E1216 15:12:25.167983 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-d9rg9" podUID="b70a54b3-3bc0-45e4-add9-d47b81371266" Dec 16 15:12:25 crc kubenswrapper[4775]: E1216 15:12:25.962032 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-d9rg9" podUID="b70a54b3-3bc0-45e4-add9-d47b81371266" Dec 16 15:12:26 crc kubenswrapper[4775]: E1216 15:12:26.089980 4775 log.go:32] "PullImage from image service failed" err="rpc error: 
code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 16 15:12:26 crc kubenswrapper[4775]: E1216 15:12:26.090235 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sp2dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8665b56d78-7fmnw_openstack-operators(19d1c138-c230-44b2-972c-c557693054f5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:26 crc kubenswrapper[4775]: E1216 15:12:26.091732 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-7fmnw" podUID="19d1c138-c230-44b2-972c-c557693054f5" Dec 16 15:12:26 crc kubenswrapper[4775]: E1216 15:12:26.963548 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-7fmnw" podUID="19d1c138-c230-44b2-972c-c557693054f5" Dec 16 15:12:27 crc kubenswrapper[4775]: E1216 15:12:27.080384 4775 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f" Dec 16 15:12:27 crc kubenswrapper[4775]: E1216 15:12:27.080596 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gscnd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-97d456b9-d2kbz_openstack-operators(2f9a8b75-2e17-43ce-be88-dbc6f7ec0cb1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:27 crc kubenswrapper[4775]: E1216 15:12:27.081790 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-d2kbz" podUID="2f9a8b75-2e17-43ce-be88-dbc6f7ec0cb1" Dec 16 15:12:27 crc kubenswrapper[4775]: E1216 15:12:27.923120 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a" Dec 16 15:12:27 crc kubenswrapper[4775]: E1216 15:12:27.923370 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kq4w4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5fdd9786f7-gcfkm_openstack-operators(f82f14a2-7460-4a06-978b-d22d9ad7d6bd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:27 crc kubenswrapper[4775]: E1216 15:12:27.924804 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-gcfkm" podUID="f82f14a2-7460-4a06-978b-d22d9ad7d6bd" Dec 16 15:12:27 crc kubenswrapper[4775]: E1216 15:12:27.968289 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-d2kbz" podUID="2f9a8b75-2e17-43ce-be88-dbc6f7ec0cb1" Dec 16 15:12:27 crc kubenswrapper[4775]: E1216 15:12:27.969180 4775 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a\\\"\"" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-gcfkm" podUID="f82f14a2-7460-4a06-978b-d22d9ad7d6bd" Dec 16 15:12:28 crc kubenswrapper[4775]: E1216 15:12:28.654388 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 16 15:12:28 crc kubenswrapper[4775]: E1216 15:12:28.654627 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-75r2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-68c649d9d-nj5th_openstack-operators(11012716-6e3c-4b17-97c7-16e723ad1092): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:28 crc kubenswrapper[4775]: E1216 15:12:28.656315 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-nj5th" podUID="11012716-6e3c-4b17-97c7-16e723ad1092" Dec 16 15:12:28 crc kubenswrapper[4775]: E1216 15:12:28.973463 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-nj5th" podUID="11012716-6e3c-4b17-97c7-16e723ad1092" Dec 16 15:12:29 crc kubenswrapper[4775]: E1216 15:12:29.602123 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991" Dec 16 15:12:29 crc kubenswrapper[4775]: E1216 15:12:29.602775 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4sxqs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5c6df8f9-r8p6v_openstack-operators(85cc53cf-83a7-4810-b0fc-7317f9327c09): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:29 crc kubenswrapper[4775]: E1216 15:12:29.604435 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-r8p6v" podUID="85cc53cf-83a7-4810-b0fc-7317f9327c09" Dec 16 15:12:29 crc kubenswrapper[4775]: E1216 15:12:29.980022 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-r8p6v" podUID="85cc53cf-83a7-4810-b0fc-7317f9327c09" Dec 16 15:12:30 crc kubenswrapper[4775]: E1216 15:12:30.354546 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a" Dec 16 15:12:30 crc kubenswrapper[4775]: E1216 15:12:30.354880 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hqncv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-55f78b7c4c-bl86c_openstack-operators(14102b10-a3ba-4f16-9928-4f41426a435f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:30 crc kubenswrapper[4775]: E1216 15:12:30.356086 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-bl86c" podUID="14102b10-a3ba-4f16-9928-4f41426a435f" Dec 16 15:12:30 crc kubenswrapper[4775]: E1216 15:12:30.985371 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-bl86c" podUID="14102b10-a3ba-4f16-9928-4f41426a435f" Dec 16 15:12:31 crc kubenswrapper[4775]: E1216 15:12:31.046305 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87" Dec 16 15:12:31 crc kubenswrapper[4775]: E1216 15:12:31.046593 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-79rm7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-f458558d7-9b8tb_openstack-operators(d8873d69-8f0e-4816-b39e-bf8506282196): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:31 crc kubenswrapper[4775]: E1216 15:12:31.048335 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-9b8tb" podUID="d8873d69-8f0e-4816-b39e-bf8506282196" Dec 16 15:12:31 crc kubenswrapper[4775]: I1216 15:12:31.339802 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 15:12:31 crc kubenswrapper[4775]: E1216 15:12:31.993583 4775 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-9b8tb" podUID="d8873d69-8f0e-4816-b39e-bf8506282196" Dec 16 15:12:33 crc kubenswrapper[4775]: E1216 15:12:33.505753 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.102:5001/openstack-k8s-operators/heat-operator:02a7b30265dace4146f6262c276624e7e19b3ecc" Dec 16 15:12:33 crc kubenswrapper[4775]: E1216 15:12:33.506173 4775 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.102:5001/openstack-k8s-operators/heat-operator:02a7b30265dace4146f6262c276624e7e19b3ecc" Dec 16 15:12:33 crc kubenswrapper[4775]: E1216 15:12:33.506419 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.102:5001/openstack-k8s-operators/heat-operator:02a7b30265dace4146f6262c276624e7e19b3ecc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nv54p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5559d9665f-4hmbr_openstack-operators(c5962fcc-3c3b-435a-b848-237af19ce258): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:33 crc kubenswrapper[4775]: E1216 15:12:33.507595 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/heat-operator-controller-manager-5559d9665f-4hmbr" podUID="c5962fcc-3c3b-435a-b848-237af19ce258" Dec 16 15:12:33 crc kubenswrapper[4775]: E1216 15:12:33.984909 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 16 15:12:33 crc kubenswrapper[4775]: E1216 15:12:33.985145 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lmlpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-756ccf86c7-4mncx_openstack-operators(f05c78d5-d86c-42de-9eee-e8d09204a0b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:33 crc kubenswrapper[4775]: E1216 15:12:33.986299 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx" podUID="f05c78d5-d86c-42de-9eee-e8d09204a0b4" Dec 16 15:12:34 crc kubenswrapper[4775]: E1216 15:12:34.004625 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"38.102.83.102:5001/openstack-k8s-operators/heat-operator:02a7b30265dace4146f6262c276624e7e19b3ecc\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5559d9665f-4hmbr" podUID="c5962fcc-3c3b-435a-b848-237af19ce258" Dec 16 15:12:34 crc kubenswrapper[4775]: I1216 15:12:34.512414 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:34 crc kubenswrapper[4775]: I1216 15:12:34.520457 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd6aff58-984e-4106-acb0-c689f6e31832-webhook-certs\") pod \"openstack-operator-controller-manager-7bc9b98d8-rvdbc\" (UID: \"bd6aff58-984e-4106-acb0-c689f6e31832\") " pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:34 crc kubenswrapper[4775]: I1216 15:12:34.789807 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:35 crc kubenswrapper[4775]: E1216 15:12:35.249578 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 16 15:12:35 crc kubenswrapper[4775]: E1216 15:12:35.249832 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sp5ls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-xd48q_openstack-operators(d132ccba-b1e9-4f8c-8129-1087a1a672b9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:35 crc kubenswrapper[4775]: E1216 15:12:35.251038 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xd48q" podUID="d132ccba-b1e9-4f8c-8129-1087a1a672b9" Dec 16 15:12:35 crc kubenswrapper[4775]: E1216 15:12:35.766612 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 16 15:12:35 crc kubenswrapper[4775]: E1216 15:12:35.766814 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r2hxc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-5c7cbf548f-lph76_openstack-operators(eff249fe-7aa9-406b-a4f0-91d7891afc8b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:35 crc kubenswrapper[4775]: E1216 15:12:35.768276 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-lph76" podUID="eff249fe-7aa9-406b-a4f0-91d7891afc8b" Dec 16 15:12:36 crc kubenswrapper[4775]: E1216 15:12:36.022458 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xd48q" podUID="d132ccba-b1e9-4f8c-8129-1087a1a672b9" Dec 16 15:12:36 crc kubenswrapper[4775]: E1216 15:12:36.028998 4775 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-lph76" podUID="eff249fe-7aa9-406b-a4f0-91d7891afc8b" Dec 16 15:12:36 crc kubenswrapper[4775]: E1216 15:12:36.286032 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 16 15:12:36 crc kubenswrapper[4775]: E1216 15:12:36.286312 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-59kjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5fbbf8b6cc-nvb99_openstack-operators(63c035e4-8ff2-49a4-94d9-57c65a71494b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:36 crc kubenswrapper[4775]: E1216 15:12:36.287601 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-nvb99" podUID="63c035e4-8ff2-49a4-94d9-57c65a71494b" Dec 16 15:12:36 crc kubenswrapper[4775]: I1216 15:12:36.682239 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc"] Dec 16 15:12:36 crc kubenswrapper[4775]: I1216 15:12:36.738685 4775 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj"] Dec 16 15:12:36 crc kubenswrapper[4775]: W1216 15:12:36.747335 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe824423_7753_4920_8aa7_93d2904280fb.slice/crio-d5d9820e1c6ddd95dc8602d55fae9b1341d5bd0da836e75f3b2dcfd7a268080b WatchSource:0}: Error finding container d5d9820e1c6ddd95dc8602d55fae9b1341d5bd0da836e75f3b2dcfd7a268080b: Status 404 returned error can't find the container with id d5d9820e1c6ddd95dc8602d55fae9b1341d5bd0da836e75f3b2dcfd7a268080b Dec 16 15:12:36 crc kubenswrapper[4775]: I1216 15:12:36.833293 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb"] Dec 16 15:12:36 crc kubenswrapper[4775]: W1216 15:12:36.841103 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod275767d8_4eed_4a90_8d43_348c607ee37e.slice/crio-96d6cbf36ad4b482ff821566840534e91478c77e042781a176d63acf45002132 WatchSource:0}: Error finding container 96d6cbf36ad4b482ff821566840534e91478c77e042781a176d63acf45002132: Status 404 returned error can't find the container with id 96d6cbf36ad4b482ff821566840534e91478c77e042781a176d63acf45002132 Dec 16 15:12:37 crc kubenswrapper[4775]: I1216 15:12:37.026526 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-qmqgx" event={"ID":"1723eb19-5ef2-43d0-a1f8-590e89eb5f87","Type":"ContainerStarted","Data":"0e53528810e0557a622d436319aaac15b8d6d88c2c458cf0ebe51707e23517c8"} Dec 16 15:12:37 crc kubenswrapper[4775]: I1216 15:12:37.026820 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-qmqgx" Dec 16 15:12:37 crc kubenswrapper[4775]: 
I1216 15:12:37.028718 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-47s9s" event={"ID":"4fbf17e0-d42f-463b-9f01-a39d842812ff","Type":"ContainerStarted","Data":"9104e012cb757cc268902580cd8de904d73ddf7da064d69bc5d8b7e400439c11"} Dec 16 15:12:37 crc kubenswrapper[4775]: I1216 15:12:37.028929 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-47s9s" Dec 16 15:12:37 crc kubenswrapper[4775]: I1216 15:12:37.029684 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" event={"ID":"275767d8-4eed-4a90-8d43-348c607ee37e","Type":"ContainerStarted","Data":"96d6cbf36ad4b482ff821566840534e91478c77e042781a176d63acf45002132"} Dec 16 15:12:37 crc kubenswrapper[4775]: I1216 15:12:37.030590 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" event={"ID":"be824423-7753-4920-8aa7-93d2904280fb","Type":"ContainerStarted","Data":"d5d9820e1c6ddd95dc8602d55fae9b1341d5bd0da836e75f3b2dcfd7a268080b"} Dec 16 15:12:37 crc kubenswrapper[4775]: I1216 15:12:37.031728 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-95949466-9m4t8" event={"ID":"03a9286d-3fd3-4ec6-9a1d-fb8d613f401e","Type":"ContainerStarted","Data":"2e402fc0ff0ab1ed775031c4ca22e839d639ecb1845adce3aa5db05d226ee919"} Dec 16 15:12:37 crc kubenswrapper[4775]: I1216 15:12:37.031867 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-95949466-9m4t8" Dec 16 15:12:37 crc kubenswrapper[4775]: I1216 15:12:37.033450 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" 
event={"ID":"bd6aff58-984e-4106-acb0-c689f6e31832","Type":"ContainerStarted","Data":"1ccdb0cba9d860b81ef211518aad63a36674a78e8de1e7fe4b348224e8b24907"} Dec 16 15:12:37 crc kubenswrapper[4775]: I1216 15:12:37.033482 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" event={"ID":"bd6aff58-984e-4106-acb0-c689f6e31832","Type":"ContainerStarted","Data":"dd7994df838c2fbe493bcd73a5274f9fc6eee8ed710cec74da402d36135ee234"} Dec 16 15:12:37 crc kubenswrapper[4775]: I1216 15:12:37.034111 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:37 crc kubenswrapper[4775]: I1216 15:12:37.035631 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-5gxtk" event={"ID":"e002ee65-47de-44d4-864e-531283c322f7","Type":"ContainerStarted","Data":"831077433d9cf560277769ecbe1287013b16596520de1458b3c72d07848750dc"} Dec 16 15:12:37 crc kubenswrapper[4775]: I1216 15:12:37.036055 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-5gxtk" Dec 16 15:12:37 crc kubenswrapper[4775]: E1216 15:12:37.036475 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-nvb99" podUID="63c035e4-8ff2-49a4-94d9-57c65a71494b" Dec 16 15:12:37 crc kubenswrapper[4775]: I1216 15:12:37.044911 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-qmqgx" podStartSLOduration=5.674784812 
podStartE2EDuration="36.044895298s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:05.932991253 +0000 UTC m=+1050.884070176" lastFinishedPulling="2025-12-16 15:12:36.303101739 +0000 UTC m=+1081.254180662" observedRunningTime="2025-12-16 15:12:37.042548573 +0000 UTC m=+1081.993627526" watchObservedRunningTime="2025-12-16 15:12:37.044895298 +0000 UTC m=+1081.995974221" Dec 16 15:12:37 crc kubenswrapper[4775]: I1216 15:12:37.072123 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" podStartSLOduration=35.072103941 podStartE2EDuration="35.072103941s" podCreationTimestamp="2025-12-16 15:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:12:37.066710848 +0000 UTC m=+1082.017789781" watchObservedRunningTime="2025-12-16 15:12:37.072103941 +0000 UTC m=+1082.023182864" Dec 16 15:12:37 crc kubenswrapper[4775]: I1216 15:12:37.085803 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-95949466-9m4t8" podStartSLOduration=5.237027753 podStartE2EDuration="36.085786109s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:03.85398283 +0000 UTC m=+1048.805061763" lastFinishedPulling="2025-12-16 15:12:34.702741156 +0000 UTC m=+1079.653820119" observedRunningTime="2025-12-16 15:12:37.082290107 +0000 UTC m=+1082.033369030" watchObservedRunningTime="2025-12-16 15:12:37.085786109 +0000 UTC m=+1082.036865032" Dec 16 15:12:37 crc kubenswrapper[4775]: I1216 15:12:37.125095 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-5gxtk" podStartSLOduration=3.680391232 podStartE2EDuration="36.125074979s" podCreationTimestamp="2025-12-16 
15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:03.853733132 +0000 UTC m=+1048.804812055" lastFinishedPulling="2025-12-16 15:12:36.298416879 +0000 UTC m=+1081.249495802" observedRunningTime="2025-12-16 15:12:37.121032599 +0000 UTC m=+1082.072111532" watchObservedRunningTime="2025-12-16 15:12:37.125074979 +0000 UTC m=+1082.076153902" Dec 16 15:12:37 crc kubenswrapper[4775]: I1216 15:12:37.158429 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-47s9s" podStartSLOduration=7.35981133 podStartE2EDuration="36.158404358s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:05.904093566 +0000 UTC m=+1050.855172489" lastFinishedPulling="2025-12-16 15:12:34.702686594 +0000 UTC m=+1079.653765517" observedRunningTime="2025-12-16 15:12:37.151492507 +0000 UTC m=+1082.102571440" watchObservedRunningTime="2025-12-16 15:12:37.158404358 +0000 UTC m=+1082.109483281" Dec 16 15:12:41 crc kubenswrapper[4775]: I1216 15:12:41.811640 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-95949466-9m4t8" Dec 16 15:12:41 crc kubenswrapper[4775]: I1216 15:12:41.834608 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-5gxtk" Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.153803 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" event={"ID":"be824423-7753-4920-8aa7-93d2904280fb","Type":"ContainerStarted","Data":"8c3b2e2add8036bb672df4e858ff9a09231d0086b4ff808e0fc35b6a90c3b001"} Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.154187 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" Dec 16 
15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.156410 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-d9rg9" event={"ID":"b70a54b3-3bc0-45e4-add9-d47b81371266","Type":"ContainerStarted","Data":"b4f4db44d73e36eb537cb0c4d65a6ec8785c73ce6375362bc996ec97a136407f"} Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.156749 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-d9rg9" Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.159341 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-pk5fg" event={"ID":"a738c781-0876-490f-bf95-d7d77a6f2aff","Type":"ContainerStarted","Data":"24acf51e354033ed6b4d59b25c07a931118dd7908d74019e89c597c47d5bc5a9"} Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.159975 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-pk5fg" Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.162256 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-dqpmk" event={"ID":"8e002a19-f4ca-4186-940c-321834e88e5e","Type":"ContainerStarted","Data":"76f5607e92f4a2267bd848cf5b5a8342a7ed51a83fbb457ff29ffeda86714191"} Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.162796 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-dqpmk" Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.164244 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-gcfkm" 
event={"ID":"f82f14a2-7460-4a06-978b-d22d9ad7d6bd","Type":"ContainerStarted","Data":"94ee4499c96008bf8c124916ef5ff516cf78a33bff8bd25bfc08ca5a5147a24c"} Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.164699 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-gcfkm" Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.168123 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-dfdrn" event={"ID":"d0dab2aa-577b-4a9d-bcce-0530cbb3e4b6","Type":"ContainerStarted","Data":"5afe3f972b970425f9bf18d3ad72fdd40a135b91662b2fe5dbf081ddc6fd9e65"} Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.168372 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-dfdrn" Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.170697 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" event={"ID":"275767d8-4eed-4a90-8d43-348c607ee37e","Type":"ContainerStarted","Data":"2ef8f5641f7110367849efa287001d738b4ae5d24789755ed89a5c7aefc6fc0f"} Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.170909 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.180093 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" podStartSLOduration=36.896131016 podStartE2EDuration="41.18007456s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:36.750616221 +0000 UTC m=+1081.701695144" lastFinishedPulling="2025-12-16 15:12:41.034559765 +0000 UTC 
m=+1085.985638688" observedRunningTime="2025-12-16 15:12:42.177800438 +0000 UTC m=+1087.128879371" watchObservedRunningTime="2025-12-16 15:12:42.18007456 +0000 UTC m=+1087.131153473" Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.239080 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" podStartSLOduration=37.049542195 podStartE2EDuration="41.239059302s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:36.844408938 +0000 UTC m=+1081.795487861" lastFinishedPulling="2025-12-16 15:12:41.033926045 +0000 UTC m=+1085.985004968" observedRunningTime="2025-12-16 15:12:42.236526011 +0000 UTC m=+1087.187604944" watchObservedRunningTime="2025-12-16 15:12:42.239059302 +0000 UTC m=+1087.190138225" Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.256915 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-dfdrn" podStartSLOduration=4.82190834 podStartE2EDuration="41.256870014s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:04.610718718 +0000 UTC m=+1049.561797641" lastFinishedPulling="2025-12-16 15:12:41.045680402 +0000 UTC m=+1085.996759315" observedRunningTime="2025-12-16 15:12:42.25519358 +0000 UTC m=+1087.206272503" watchObservedRunningTime="2025-12-16 15:12:42.256870014 +0000 UTC m=+1087.207948937" Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.275854 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-pk5fg" podStartSLOduration=5.774895622 podStartE2EDuration="41.275835752s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:05.166630206 +0000 UTC m=+1050.117709129" lastFinishedPulling="2025-12-16 15:12:40.667570336 +0000 UTC 
m=+1085.618649259" observedRunningTime="2025-12-16 15:12:42.272566517 +0000 UTC m=+1087.223645440" watchObservedRunningTime="2025-12-16 15:12:42.275835752 +0000 UTC m=+1087.226914675" Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.288733 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-dqpmk" podStartSLOduration=4.872597085 podStartE2EDuration="41.288714565s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:04.618213958 +0000 UTC m=+1049.569292881" lastFinishedPulling="2025-12-16 15:12:41.034331438 +0000 UTC m=+1085.985410361" observedRunningTime="2025-12-16 15:12:42.285031417 +0000 UTC m=+1087.236110340" watchObservedRunningTime="2025-12-16 15:12:42.288714565 +0000 UTC m=+1087.239793488" Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.308699 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-47s9s" Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.311541 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-d9rg9" podStartSLOduration=5.8820711679999995 podStartE2EDuration="41.311520666s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:05.604917671 +0000 UTC m=+1050.555996594" lastFinishedPulling="2025-12-16 15:12:41.034367169 +0000 UTC m=+1085.985446092" observedRunningTime="2025-12-16 15:12:42.305536345 +0000 UTC m=+1087.256615268" watchObservedRunningTime="2025-12-16 15:12:42.311520666 +0000 UTC m=+1087.262599589" Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 15:12:42.345511 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-qmqgx" Dec 16 15:12:42 crc kubenswrapper[4775]: I1216 
15:12:42.350015 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-gcfkm" podStartSLOduration=5.448277358 podStartE2EDuration="41.3499837s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:05.132571654 +0000 UTC m=+1050.083650577" lastFinishedPulling="2025-12-16 15:12:41.034277996 +0000 UTC m=+1085.985356919" observedRunningTime="2025-12-16 15:12:42.346982984 +0000 UTC m=+1087.298061997" watchObservedRunningTime="2025-12-16 15:12:42.3499837 +0000 UTC m=+1087.301062633" Dec 16 15:12:43 crc kubenswrapper[4775]: I1216 15:12:43.178620 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-nj5th" event={"ID":"11012716-6e3c-4b17-97c7-16e723ad1092","Type":"ContainerStarted","Data":"6571612f623c3e9cd94f04412024e0c19d85f0ad11c7c4ded1379e40e1f78e0f"} Dec 16 15:12:43 crc kubenswrapper[4775]: I1216 15:12:43.179205 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-nj5th" Dec 16 15:12:43 crc kubenswrapper[4775]: I1216 15:12:43.180507 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-7fmnw" event={"ID":"19d1c138-c230-44b2-972c-c557693054f5","Type":"ContainerStarted","Data":"0943520a166ffffdf924a519fb96b568577b4bdd94f40aaf67149113c6b66547"} Dec 16 15:12:43 crc kubenswrapper[4775]: I1216 15:12:43.204147 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-7fmnw" Dec 16 15:12:43 crc kubenswrapper[4775]: I1216 15:12:43.229657 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-nj5th" podStartSLOduration=6.113695828 
podStartE2EDuration="42.22962791s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:05.591241773 +0000 UTC m=+1050.542320696" lastFinishedPulling="2025-12-16 15:12:41.707173855 +0000 UTC m=+1086.658252778" observedRunningTime="2025-12-16 15:12:43.218053989 +0000 UTC m=+1088.169132922" watchObservedRunningTime="2025-12-16 15:12:43.22962791 +0000 UTC m=+1088.180706843" Dec 16 15:12:43 crc kubenswrapper[4775]: I1216 15:12:43.247502 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-7fmnw" podStartSLOduration=5.742492134 podStartE2EDuration="42.247476392s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:05.641936019 +0000 UTC m=+1050.593014942" lastFinishedPulling="2025-12-16 15:12:42.146920277 +0000 UTC m=+1087.097999200" observedRunningTime="2025-12-16 15:12:43.244802556 +0000 UTC m=+1088.195881489" watchObservedRunningTime="2025-12-16 15:12:43.247476392 +0000 UTC m=+1088.198555325" Dec 16 15:12:44 crc kubenswrapper[4775]: I1216 15:12:44.215694 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-d2kbz" event={"ID":"2f9a8b75-2e17-43ce-be88-dbc6f7ec0cb1","Type":"ContainerStarted","Data":"fd41cc8da0cfb7ac657836ec0a368c6c1104b6ae16cbcf3a39cc45f77b322f8d"} Dec 16 15:12:44 crc kubenswrapper[4775]: I1216 15:12:44.216196 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-d2kbz" Dec 16 15:12:44 crc kubenswrapper[4775]: I1216 15:12:44.236249 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-d2kbz" podStartSLOduration=5.380914447 podStartE2EDuration="43.236227041s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" 
firstStartedPulling="2025-12-16 15:12:05.211160294 +0000 UTC m=+1050.162239217" lastFinishedPulling="2025-12-16 15:12:43.066472888 +0000 UTC m=+1088.017551811" observedRunningTime="2025-12-16 15:12:44.231401376 +0000 UTC m=+1089.182480299" watchObservedRunningTime="2025-12-16 15:12:44.236227041 +0000 UTC m=+1089.187305964" Dec 16 15:12:44 crc kubenswrapper[4775]: I1216 15:12:44.817155 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7bc9b98d8-rvdbc" Dec 16 15:12:46 crc kubenswrapper[4775]: I1216 15:12:46.231843 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-r8p6v" event={"ID":"85cc53cf-83a7-4810-b0fc-7317f9327c09","Type":"ContainerStarted","Data":"ce6df6808e5969a8d0c659c25d764b7ca6e7743ec0cd48e4bffefbe530df66e7"} Dec 16 15:12:46 crc kubenswrapper[4775]: I1216 15:12:46.233437 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-r8p6v" Dec 16 15:12:46 crc kubenswrapper[4775]: I1216 15:12:46.235419 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5559d9665f-4hmbr" event={"ID":"c5962fcc-3c3b-435a-b848-237af19ce258","Type":"ContainerStarted","Data":"bf8629758ccad22a479f3894147f95aedb24dbd6545eae7d1507bf3e1e64cafa"} Dec 16 15:12:46 crc kubenswrapper[4775]: I1216 15:12:46.235993 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5559d9665f-4hmbr" Dec 16 15:12:46 crc kubenswrapper[4775]: I1216 15:12:46.254953 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-r8p6v" podStartSLOduration=5.126800388 podStartE2EDuration="45.25493548s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" 
firstStartedPulling="2025-12-16 15:12:05.918898521 +0000 UTC m=+1050.869977444" lastFinishedPulling="2025-12-16 15:12:46.047033613 +0000 UTC m=+1090.998112536" observedRunningTime="2025-12-16 15:12:46.252488642 +0000 UTC m=+1091.203567565" watchObservedRunningTime="2025-12-16 15:12:46.25493548 +0000 UTC m=+1091.206014403" Dec 16 15:12:46 crc kubenswrapper[4775]: I1216 15:12:46.269452 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5559d9665f-4hmbr" podStartSLOduration=5.340614045 podStartE2EDuration="45.269431845s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:05.640951517 +0000 UTC m=+1050.592030450" lastFinishedPulling="2025-12-16 15:12:45.569769327 +0000 UTC m=+1090.520848250" observedRunningTime="2025-12-16 15:12:46.267380429 +0000 UTC m=+1091.218459382" watchObservedRunningTime="2025-12-16 15:12:46.269431845 +0000 UTC m=+1091.220510768" Dec 16 15:12:46 crc kubenswrapper[4775]: E1216 15:12:46.339310 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx" podUID="f05c78d5-d86c-42de-9eee-e8d09204a0b4" Dec 16 15:12:47 crc kubenswrapper[4775]: I1216 15:12:47.244371 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-9b8tb" event={"ID":"d8873d69-8f0e-4816-b39e-bf8506282196","Type":"ContainerStarted","Data":"81c55fcfd43811d4cf6b01e4b3c5bd37b5ca47d64a975d2018c8fbc8cb6908bd"} Dec 16 15:12:47 crc kubenswrapper[4775]: I1216 15:12:47.244932 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ironic-operator-controller-manager-f458558d7-9b8tb" Dec 16 15:12:47 crc kubenswrapper[4775]: I1216 15:12:47.246429 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-bl86c" event={"ID":"14102b10-a3ba-4f16-9928-4f41426a435f","Type":"ContainerStarted","Data":"af31642b7feebc70ea8acf25a7b3f8dad2ceb21cd657dafa1b5e0ec9fd794db6"} Dec 16 15:12:47 crc kubenswrapper[4775]: I1216 15:12:47.246612 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-bl86c" Dec 16 15:12:47 crc kubenswrapper[4775]: I1216 15:12:47.259588 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-9b8tb" podStartSLOduration=4.6734231269999995 podStartE2EDuration="46.259570828s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:04.654621755 +0000 UTC m=+1049.605700678" lastFinishedPulling="2025-12-16 15:12:46.240769456 +0000 UTC m=+1091.191848379" observedRunningTime="2025-12-16 15:12:47.258199135 +0000 UTC m=+1092.209278058" watchObservedRunningTime="2025-12-16 15:12:47.259570828 +0000 UTC m=+1092.210649741" Dec 16 15:12:47 crc kubenswrapper[4775]: I1216 15:12:47.280134 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-bl86c" podStartSLOduration=3.873353681 podStartE2EDuration="45.280109957s" podCreationTimestamp="2025-12-16 15:12:02 +0000 UTC" firstStartedPulling="2025-12-16 15:12:05.594108825 +0000 UTC m=+1050.545187748" lastFinishedPulling="2025-12-16 15:12:47.000865091 +0000 UTC m=+1091.951944024" observedRunningTime="2025-12-16 15:12:47.273207106 +0000 UTC m=+1092.224286029" watchObservedRunningTime="2025-12-16 15:12:47.280109957 +0000 UTC m=+1092.231188880" Dec 16 15:12:47 crc 
kubenswrapper[4775]: I1216 15:12:47.955499 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-6558fdd56c-jc4nj" Dec 16 15:12:48 crc kubenswrapper[4775]: I1216 15:12:48.656124 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb" Dec 16 15:12:49 crc kubenswrapper[4775]: I1216 15:12:49.258866 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-lph76" event={"ID":"eff249fe-7aa9-406b-a4f0-91d7891afc8b","Type":"ContainerStarted","Data":"5c30e6ba416ad348f24c0f0b0ccb078c2259f62ecc2a463c3e25af2794c3aa04"} Dec 16 15:12:49 crc kubenswrapper[4775]: I1216 15:12:49.259131 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-lph76" Dec 16 15:12:49 crc kubenswrapper[4775]: I1216 15:12:49.286501 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-lph76" podStartSLOduration=5.501334058 podStartE2EDuration="48.2864743s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:05.301510421 +0000 UTC m=+1050.252589344" lastFinishedPulling="2025-12-16 15:12:48.086650663 +0000 UTC m=+1093.037729586" observedRunningTime="2025-12-16 15:12:49.2817964 +0000 UTC m=+1094.232875323" watchObservedRunningTime="2025-12-16 15:12:49.2864743 +0000 UTC m=+1094.237553223" Dec 16 15:12:51 crc kubenswrapper[4775]: I1216 15:12:51.937345 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-dfdrn" Dec 16 15:12:51 crc kubenswrapper[4775]: I1216 15:12:51.967951 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/glance-operator-controller-manager-767f9d7567-dqpmk" Dec 16 15:12:52 crc kubenswrapper[4775]: I1216 15:12:52.001349 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5559d9665f-4hmbr" Dec 16 15:12:52 crc kubenswrapper[4775]: I1216 15:12:52.025628 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-pk5fg" Dec 16 15:12:52 crc kubenswrapper[4775]: I1216 15:12:52.104012 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-9b8tb" Dec 16 15:12:52 crc kubenswrapper[4775]: I1216 15:12:52.178585 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-gcfkm" Dec 16 15:12:52 crc kubenswrapper[4775]: I1216 15:12:52.338418 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-nj5th" Dec 16 15:12:53 crc kubenswrapper[4775]: I1216 15:12:53.118939 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-d9rg9" Dec 16 15:12:53 crc kubenswrapper[4775]: I1216 15:12:53.205367 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-7fmnw" Dec 16 15:12:53 crc kubenswrapper[4775]: I1216 15:12:53.387424 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-bl86c" Dec 16 15:12:53 crc kubenswrapper[4775]: I1216 15:12:53.403187 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-r8p6v" Dec 16 15:12:53 crc kubenswrapper[4775]: I1216 15:12:53.442719 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-d2kbz" Dec 16 15:13:02 crc kubenswrapper[4775]: I1216 15:13:02.270211 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-lph76" Dec 16 15:13:06 crc kubenswrapper[4775]: I1216 15:13:06.408149 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-nvb99" event={"ID":"63c035e4-8ff2-49a4-94d9-57c65a71494b","Type":"ContainerStarted","Data":"16c36a72f36602ad71b8af8eed4926198eb50aff4983a2b4c8a690ee94d1ecb6"} Dec 16 15:13:06 crc kubenswrapper[4775]: I1216 15:13:06.409870 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-nvb99" Dec 16 15:13:06 crc kubenswrapper[4775]: I1216 15:13:06.411334 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xd48q" event={"ID":"d132ccba-b1e9-4f8c-8129-1087a1a672b9","Type":"ContainerStarted","Data":"feffeda08cfb01623a29f936440722b868142c1053756bfd51fae794d7acf3b9"} Dec 16 15:13:06 crc kubenswrapper[4775]: I1216 15:13:06.432281 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-nvb99" podStartSLOduration=5.212784792 podStartE2EDuration="1m5.43226388s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:05.603432883 +0000 UTC m=+1050.554511806" lastFinishedPulling="2025-12-16 15:13:05.822911971 +0000 UTC m=+1110.773990894" observedRunningTime="2025-12-16 15:13:06.428352856 +0000 UTC m=+1111.379431789" 
watchObservedRunningTime="2025-12-16 15:13:06.43226388 +0000 UTC m=+1111.383342803" Dec 16 15:13:06 crc kubenswrapper[4775]: I1216 15:13:06.445604 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xd48q" podStartSLOduration=4.224043566 podStartE2EDuration="1m4.445587159s" podCreationTimestamp="2025-12-16 15:12:02 +0000 UTC" firstStartedPulling="2025-12-16 15:12:05.604115135 +0000 UTC m=+1050.555194058" lastFinishedPulling="2025-12-16 15:13:05.825658728 +0000 UTC m=+1110.776737651" observedRunningTime="2025-12-16 15:13:06.442958606 +0000 UTC m=+1111.394037529" watchObservedRunningTime="2025-12-16 15:13:06.445587159 +0000 UTC m=+1111.396666082" Dec 16 15:13:07 crc kubenswrapper[4775]: I1216 15:13:07.420089 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx" event={"ID":"f05c78d5-d86c-42de-9eee-e8d09204a0b4","Type":"ContainerStarted","Data":"2588a17ff8aa67e9731e66b60e111219c9b7d94badbcbae604e7048a3db24c6f"} Dec 16 15:13:07 crc kubenswrapper[4775]: I1216 15:13:07.420321 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx" Dec 16 15:13:07 crc kubenswrapper[4775]: I1216 15:13:07.439949 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx" podStartSLOduration=5.953816286 podStartE2EDuration="1m6.439932851s" podCreationTimestamp="2025-12-16 15:12:01 +0000 UTC" firstStartedPulling="2025-12-16 15:12:05.963259743 +0000 UTC m=+1050.914338666" lastFinishedPulling="2025-12-16 15:13:06.449376308 +0000 UTC m=+1111.400455231" observedRunningTime="2025-12-16 15:13:07.435592594 +0000 UTC m=+1112.386671527" watchObservedRunningTime="2025-12-16 15:13:07.439932851 +0000 UTC m=+1112.391011784" Dec 16 15:13:12 crc kubenswrapper[4775]: 
I1216 15:13:12.321940 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-nvb99" Dec 16 15:13:13 crc kubenswrapper[4775]: I1216 15:13:13.119352 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-4mncx" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.286858 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7npzr"] Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.290511 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-7npzr" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.295221 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.295560 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.295957 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-cwhbf" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.299181 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.304601 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7npzr"] Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.364532 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-n4jdv"] Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.366318 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-n4jdv" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.369057 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.380400 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-n4jdv"] Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.386750 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zdlw\" (UniqueName: \"kubernetes.io/projected/6565fe67-07cb-4f93-aefd-a8ae5622b79a-kube-api-access-4zdlw\") pod \"dnsmasq-dns-675f4bcbfc-7npzr\" (UID: \"6565fe67-07cb-4f93-aefd-a8ae5622b79a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7npzr" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.386786 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36fe89c9-3522-47e3-bd98-9a75fb410abd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-n4jdv\" (UID: \"36fe89c9-3522-47e3-bd98-9a75fb410abd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n4jdv" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.386808 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36fe89c9-3522-47e3-bd98-9a75fb410abd-config\") pod \"dnsmasq-dns-78dd6ddcc-n4jdv\" (UID: \"36fe89c9-3522-47e3-bd98-9a75fb410abd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n4jdv" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.386843 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6565fe67-07cb-4f93-aefd-a8ae5622b79a-config\") pod \"dnsmasq-dns-675f4bcbfc-7npzr\" (UID: \"6565fe67-07cb-4f93-aefd-a8ae5622b79a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7npzr" Dec 16 15:13:27 
crc kubenswrapper[4775]: I1216 15:13:27.386873 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhq9j\" (UniqueName: \"kubernetes.io/projected/36fe89c9-3522-47e3-bd98-9a75fb410abd-kube-api-access-xhq9j\") pod \"dnsmasq-dns-78dd6ddcc-n4jdv\" (UID: \"36fe89c9-3522-47e3-bd98-9a75fb410abd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n4jdv" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.488448 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6565fe67-07cb-4f93-aefd-a8ae5622b79a-config\") pod \"dnsmasq-dns-675f4bcbfc-7npzr\" (UID: \"6565fe67-07cb-4f93-aefd-a8ae5622b79a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7npzr" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.488504 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhq9j\" (UniqueName: \"kubernetes.io/projected/36fe89c9-3522-47e3-bd98-9a75fb410abd-kube-api-access-xhq9j\") pod \"dnsmasq-dns-78dd6ddcc-n4jdv\" (UID: \"36fe89c9-3522-47e3-bd98-9a75fb410abd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n4jdv" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.488564 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zdlw\" (UniqueName: \"kubernetes.io/projected/6565fe67-07cb-4f93-aefd-a8ae5622b79a-kube-api-access-4zdlw\") pod \"dnsmasq-dns-675f4bcbfc-7npzr\" (UID: \"6565fe67-07cb-4f93-aefd-a8ae5622b79a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7npzr" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.488583 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36fe89c9-3522-47e3-bd98-9a75fb410abd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-n4jdv\" (UID: \"36fe89c9-3522-47e3-bd98-9a75fb410abd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n4jdv" Dec 16 15:13:27 crc kubenswrapper[4775]: 
I1216 15:13:27.488604 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36fe89c9-3522-47e3-bd98-9a75fb410abd-config\") pod \"dnsmasq-dns-78dd6ddcc-n4jdv\" (UID: \"36fe89c9-3522-47e3-bd98-9a75fb410abd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n4jdv" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.489488 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36fe89c9-3522-47e3-bd98-9a75fb410abd-config\") pod \"dnsmasq-dns-78dd6ddcc-n4jdv\" (UID: \"36fe89c9-3522-47e3-bd98-9a75fb410abd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n4jdv" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.489816 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6565fe67-07cb-4f93-aefd-a8ae5622b79a-config\") pod \"dnsmasq-dns-675f4bcbfc-7npzr\" (UID: \"6565fe67-07cb-4f93-aefd-a8ae5622b79a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7npzr" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.489901 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36fe89c9-3522-47e3-bd98-9a75fb410abd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-n4jdv\" (UID: \"36fe89c9-3522-47e3-bd98-9a75fb410abd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n4jdv" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.507973 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhq9j\" (UniqueName: \"kubernetes.io/projected/36fe89c9-3522-47e3-bd98-9a75fb410abd-kube-api-access-xhq9j\") pod \"dnsmasq-dns-78dd6ddcc-n4jdv\" (UID: \"36fe89c9-3522-47e3-bd98-9a75fb410abd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n4jdv" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.515289 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zdlw\" (UniqueName: 
\"kubernetes.io/projected/6565fe67-07cb-4f93-aefd-a8ae5622b79a-kube-api-access-4zdlw\") pod \"dnsmasq-dns-675f4bcbfc-7npzr\" (UID: \"6565fe67-07cb-4f93-aefd-a8ae5622b79a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7npzr" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.609794 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-7npzr" Dec 16 15:13:27 crc kubenswrapper[4775]: I1216 15:13:27.682866 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-n4jdv" Dec 16 15:13:28 crc kubenswrapper[4775]: I1216 15:13:28.108144 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7npzr"] Dec 16 15:13:28 crc kubenswrapper[4775]: W1216 15:13:28.115967 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6565fe67_07cb_4f93_aefd_a8ae5622b79a.slice/crio-5b9149d39901a66788b326d84649e0084b650b56d4697d82cd38d1e7fc512b0f WatchSource:0}: Error finding container 5b9149d39901a66788b326d84649e0084b650b56d4697d82cd38d1e7fc512b0f: Status 404 returned error can't find the container with id 5b9149d39901a66788b326d84649e0084b650b56d4697d82cd38d1e7fc512b0f Dec 16 15:13:28 crc kubenswrapper[4775]: I1216 15:13:28.195955 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-n4jdv"] Dec 16 15:13:28 crc kubenswrapper[4775]: W1216 15:13:28.200179 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36fe89c9_3522_47e3_bd98_9a75fb410abd.slice/crio-9d140db3502c578beeb7a87d60b61f0499b3d772286b1c0922f8ea39289852fc WatchSource:0}: Error finding container 9d140db3502c578beeb7a87d60b61f0499b3d772286b1c0922f8ea39289852fc: Status 404 returned error can't find the container with id 9d140db3502c578beeb7a87d60b61f0499b3d772286b1c0922f8ea39289852fc Dec 
16 15:13:28 crc kubenswrapper[4775]: I1216 15:13:28.566602 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-n4jdv" event={"ID":"36fe89c9-3522-47e3-bd98-9a75fb410abd","Type":"ContainerStarted","Data":"9d140db3502c578beeb7a87d60b61f0499b3d772286b1c0922f8ea39289852fc"} Dec 16 15:13:28 crc kubenswrapper[4775]: I1216 15:13:28.567708 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-7npzr" event={"ID":"6565fe67-07cb-4f93-aefd-a8ae5622b79a","Type":"ContainerStarted","Data":"5b9149d39901a66788b326d84649e0084b650b56d4697d82cd38d1e7fc512b0f"} Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.505410 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7npzr"] Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.526583 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jdj4n"] Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.527875 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.551445 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jdj4n"] Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.667434 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43c5e083-39ac-488d-a68f-699ba4b264cb-config\") pod \"dnsmasq-dns-666b6646f7-jdj4n\" (UID: \"43c5e083-39ac-488d-a68f-699ba4b264cb\") " pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.667522 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43c5e083-39ac-488d-a68f-699ba4b264cb-dns-svc\") pod \"dnsmasq-dns-666b6646f7-jdj4n\" (UID: \"43c5e083-39ac-488d-a68f-699ba4b264cb\") " pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.667586 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czpqk\" (UniqueName: \"kubernetes.io/projected/43c5e083-39ac-488d-a68f-699ba4b264cb-kube-api-access-czpqk\") pod \"dnsmasq-dns-666b6646f7-jdj4n\" (UID: \"43c5e083-39ac-488d-a68f-699ba4b264cb\") " pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.770628 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43c5e083-39ac-488d-a68f-699ba4b264cb-dns-svc\") pod \"dnsmasq-dns-666b6646f7-jdj4n\" (UID: \"43c5e083-39ac-488d-a68f-699ba4b264cb\") " pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.770724 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czpqk\" (UniqueName: 
\"kubernetes.io/projected/43c5e083-39ac-488d-a68f-699ba4b264cb-kube-api-access-czpqk\") pod \"dnsmasq-dns-666b6646f7-jdj4n\" (UID: \"43c5e083-39ac-488d-a68f-699ba4b264cb\") " pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.770758 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43c5e083-39ac-488d-a68f-699ba4b264cb-config\") pod \"dnsmasq-dns-666b6646f7-jdj4n\" (UID: \"43c5e083-39ac-488d-a68f-699ba4b264cb\") " pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.771677 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43c5e083-39ac-488d-a68f-699ba4b264cb-dns-svc\") pod \"dnsmasq-dns-666b6646f7-jdj4n\" (UID: \"43c5e083-39ac-488d-a68f-699ba4b264cb\") " pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.772354 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43c5e083-39ac-488d-a68f-699ba4b264cb-config\") pod \"dnsmasq-dns-666b6646f7-jdj4n\" (UID: \"43c5e083-39ac-488d-a68f-699ba4b264cb\") " pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.828960 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czpqk\" (UniqueName: \"kubernetes.io/projected/43c5e083-39ac-488d-a68f-699ba4b264cb-kube-api-access-czpqk\") pod \"dnsmasq-dns-666b6646f7-jdj4n\" (UID: \"43c5e083-39ac-488d-a68f-699ba4b264cb\") " pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.841250 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-n4jdv"] Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.867187 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-s9zj2"] Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.868506 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.868658 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.875380 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ssbl\" (UniqueName: \"kubernetes.io/projected/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-kube-api-access-9ssbl\") pod \"dnsmasq-dns-57d769cc4f-s9zj2\" (UID: \"06fb4931-8386-4c6f-86c6-2cb5c0a323f0\") " pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.875464 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-config\") pod \"dnsmasq-dns-57d769cc4f-s9zj2\" (UID: \"06fb4931-8386-4c6f-86c6-2cb5c0a323f0\") " pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.875526 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-s9zj2\" (UID: \"06fb4931-8386-4c6f-86c6-2cb5c0a323f0\") " pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.877538 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s9zj2"] Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.977399 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ssbl\" (UniqueName: 
\"kubernetes.io/projected/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-kube-api-access-9ssbl\") pod \"dnsmasq-dns-57d769cc4f-s9zj2\" (UID: \"06fb4931-8386-4c6f-86c6-2cb5c0a323f0\") " pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.977473 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-config\") pod \"dnsmasq-dns-57d769cc4f-s9zj2\" (UID: \"06fb4931-8386-4c6f-86c6-2cb5c0a323f0\") " pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.977531 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-s9zj2\" (UID: \"06fb4931-8386-4c6f-86c6-2cb5c0a323f0\") " pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.978469 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-s9zj2\" (UID: \"06fb4931-8386-4c6f-86c6-2cb5c0a323f0\") " pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" Dec 16 15:13:30 crc kubenswrapper[4775]: I1216 15:13:30.979565 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-config\") pod \"dnsmasq-dns-57d769cc4f-s9zj2\" (UID: \"06fb4931-8386-4c6f-86c6-2cb5c0a323f0\") " pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.024040 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ssbl\" (UniqueName: \"kubernetes.io/projected/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-kube-api-access-9ssbl\") pod \"dnsmasq-dns-57d769cc4f-s9zj2\" 
(UID: \"06fb4931-8386-4c6f-86c6-2cb5c0a323f0\") " pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.187654 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.442656 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jdj4n"] Dec 16 15:13:31 crc kubenswrapper[4775]: W1216 15:13:31.457324 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43c5e083_39ac_488d_a68f_699ba4b264cb.slice/crio-ba6e92140b8ffd6a356e09b2c378defebe39f8fb6ca25745859a6e8390d03789 WatchSource:0}: Error finding container ba6e92140b8ffd6a356e09b2c378defebe39f8fb6ca25745859a6e8390d03789: Status 404 returned error can't find the container with id ba6e92140b8ffd6a356e09b2c378defebe39f8fb6ca25745859a6e8390d03789 Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.595190 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" event={"ID":"43c5e083-39ac-488d-a68f-699ba4b264cb","Type":"ContainerStarted","Data":"ba6e92140b8ffd6a356e09b2c378defebe39f8fb6ca25745859a6e8390d03789"} Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.671202 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.672985 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.677094 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.677304 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.677464 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-244wz" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.677669 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.677672 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.677320 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.679855 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.680502 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.696530 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s9zj2"] Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.786329 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gfvw\" (UniqueName: \"kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-kube-api-access-2gfvw\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 
15:13:31.786373 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.786410 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.786563 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.786772 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.786900 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-config-data\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.786943 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.787070 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-server-conf\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.787171 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.787237 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/79fbce0a-9f2b-4548-b886-de6dfe5ff245-pod-info\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.787390 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/79fbce0a-9f2b-4548-b886-de6dfe5ff245-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.888845 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.888916 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/79fbce0a-9f2b-4548-b886-de6dfe5ff245-pod-info\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.888976 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/79fbce0a-9f2b-4548-b886-de6dfe5ff245-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.889011 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gfvw\" (UniqueName: \"kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-kube-api-access-2gfvw\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.889038 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.889082 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.889109 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.889150 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.889377 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-config-data\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.889403 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.889439 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-server-conf\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.890559 4775 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.891599 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-server-conf\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.891876 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.891966 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.896107 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.896296 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/79fbce0a-9f2b-4548-b886-de6dfe5ff245-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.896535 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/79fbce0a-9f2b-4548-b886-de6dfe5ff245-pod-info\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.896973 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.896996 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.905827 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-config-data\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.912830 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:31 crc kubenswrapper[4775]: I1216 15:13:31.913273 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gfvw\" 
(UniqueName: \"kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-kube-api-access-2gfvw\") pod \"rabbitmq-server-0\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " pod="openstack/rabbitmq-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.001369 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.014804 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.017226 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.019738 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-qtj5d" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.020100 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.020365 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.020555 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.020802 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.021625 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.021797 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.024653 
4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.198953 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0451a266-fe64-4e36-93f7-9ebb1e547eec-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.199014 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.199051 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.199082 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.199109 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0451a266-fe64-4e36-93f7-9ebb1e547eec-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.199128 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.199155 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.199187 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.199237 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.199274 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.199295 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6ksm\" (UniqueName: \"kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-kube-api-access-f6ksm\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.300471 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0451a266-fe64-4e36-93f7-9ebb1e547eec-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.300524 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.300556 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.300584 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.300612 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0451a266-fe64-4e36-93f7-9ebb1e547eec-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.300631 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.300655 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.300693 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.300746 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.300783 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.300806 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6ksm\" (UniqueName: \"kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-kube-api-access-f6ksm\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.302905 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.302986 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.303540 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.303931 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.304479 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.305081 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.306468 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.308758 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0451a266-fe64-4e36-93f7-9ebb1e547eec-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.320827 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.326764 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6ksm\" (UniqueName: \"kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-kube-api-access-f6ksm\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.332768 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0451a266-fe64-4e36-93f7-9ebb1e547eec-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.342912 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:32 crc kubenswrapper[4775]: I1216 15:13:32.348674 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.449505 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.450686 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.452526 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-g4j2b" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.453077 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.453274 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.453229 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.502907 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.522723 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.628286 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2baed48f-c5f4-4126-b0ed-403a38b18c00-kolla-config\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.628622 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2baed48f-c5f4-4126-b0ed-403a38b18c00-config-data-default\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.628645 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2baed48f-c5f4-4126-b0ed-403a38b18c00-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.628683 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2baed48f-c5f4-4126-b0ed-403a38b18c00-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.628702 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.628737 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2baed48f-c5f4-4126-b0ed-403a38b18c00-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.628774 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4jxf\" (UniqueName: \"kubernetes.io/projected/2baed48f-c5f4-4126-b0ed-403a38b18c00-kube-api-access-s4jxf\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.628790 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2baed48f-c5f4-4126-b0ed-403a38b18c00-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.729966 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2baed48f-c5f4-4126-b0ed-403a38b18c00-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.730026 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.730078 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2baed48f-c5f4-4126-b0ed-403a38b18c00-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.730130 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4jxf\" (UniqueName: \"kubernetes.io/projected/2baed48f-c5f4-4126-b0ed-403a38b18c00-kube-api-access-s4jxf\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.730156 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2baed48f-c5f4-4126-b0ed-403a38b18c00-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.730213 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2baed48f-c5f4-4126-b0ed-403a38b18c00-kolla-config\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.730235 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2baed48f-c5f4-4126-b0ed-403a38b18c00-config-data-default\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.730269 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2baed48f-c5f4-4126-b0ed-403a38b18c00-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.730503 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.730814 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2baed48f-c5f4-4126-b0ed-403a38b18c00-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc 
kubenswrapper[4775]: I1216 15:13:33.731354 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2baed48f-c5f4-4126-b0ed-403a38b18c00-config-data-default\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.731391 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2baed48f-c5f4-4126-b0ed-403a38b18c00-kolla-config\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.732235 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2baed48f-c5f4-4126-b0ed-403a38b18c00-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.738489 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2baed48f-c5f4-4126-b0ed-403a38b18c00-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.742595 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2baed48f-c5f4-4126-b0ed-403a38b18c00-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.752506 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4jxf\" (UniqueName: 
\"kubernetes.io/projected/2baed48f-c5f4-4126-b0ed-403a38b18c00-kube-api-access-s4jxf\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.756184 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"2baed48f-c5f4-4126-b0ed-403a38b18c00\") " pod="openstack/openstack-galera-0" Dec 16 15:13:33 crc kubenswrapper[4775]: I1216 15:13:33.811880 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.629941 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" event={"ID":"06fb4931-8386-4c6f-86c6-2cb5c0a323f0","Type":"ContainerStarted","Data":"0d0603c7b3e92f7d6a1f3267dd76a4ea243eadf37770d89126b2436054d0f139"} Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.631179 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.633701 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.639794 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.640142 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.640170 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-fw92g" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.640509 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.648594 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.749979 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e109503b-1619-4659-956c-24c58c0011a6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.750063 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e109503b-1619-4659-956c-24c58c0011a6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.750086 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/e109503b-1619-4659-956c-24c58c0011a6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.750123 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e109503b-1619-4659-956c-24c58c0011a6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.750172 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.750195 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e109503b-1619-4659-956c-24c58c0011a6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.750214 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e109503b-1619-4659-956c-24c58c0011a6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.750268 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6qmr\" (UniqueName: 
\"kubernetes.io/projected/e109503b-1619-4659-956c-24c58c0011a6-kube-api-access-h6qmr\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.851756 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e109503b-1619-4659-956c-24c58c0011a6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.852238 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.852270 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e109503b-1619-4659-956c-24c58c0011a6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.852309 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e109503b-1619-4659-956c-24c58c0011a6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.852377 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6qmr\" (UniqueName: \"kubernetes.io/projected/e109503b-1619-4659-956c-24c58c0011a6-kube-api-access-h6qmr\") 
pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.852406 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e109503b-1619-4659-956c-24c58c0011a6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.852457 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e109503b-1619-4659-956c-24c58c0011a6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.852495 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e109503b-1619-4659-956c-24c58c0011a6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.853470 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e109503b-1619-4659-956c-24c58c0011a6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.854758 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e109503b-1619-4659-956c-24c58c0011a6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " 
pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.855197 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.856587 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e109503b-1619-4659-956c-24c58c0011a6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.857514 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e109503b-1619-4659-956c-24c58c0011a6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.859565 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e109503b-1619-4659-956c-24c58c0011a6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.874699 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e109503b-1619-4659-956c-24c58c0011a6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: 
I1216 15:13:34.903569 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6qmr\" (UniqueName: \"kubernetes.io/projected/e109503b-1619-4659-956c-24c58c0011a6-kube-api-access-h6qmr\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.929062 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e109503b-1619-4659-956c-24c58c0011a6\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:34 crc kubenswrapper[4775]: I1216 15:13:34.972391 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.182899 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.184567 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.188518 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.189000 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.189039 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vrcrc" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.216602 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.260379 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8bdb272-4c39-4532-926a-f3dcc70af374-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f8bdb272-4c39-4532-926a-f3dcc70af374\") " pod="openstack/memcached-0" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.260438 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8bdb272-4c39-4532-926a-f3dcc70af374-config-data\") pod \"memcached-0\" (UID: \"f8bdb272-4c39-4532-926a-f3dcc70af374\") " pod="openstack/memcached-0" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.260460 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bdb272-4c39-4532-926a-f3dcc70af374-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f8bdb272-4c39-4532-926a-f3dcc70af374\") " pod="openstack/memcached-0" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.260560 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f8bdb272-4c39-4532-926a-f3dcc70af374-kolla-config\") pod \"memcached-0\" (UID: \"f8bdb272-4c39-4532-926a-f3dcc70af374\") " pod="openstack/memcached-0" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.260617 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d7sk\" (UniqueName: \"kubernetes.io/projected/f8bdb272-4c39-4532-926a-f3dcc70af374-kube-api-access-4d7sk\") pod \"memcached-0\" (UID: \"f8bdb272-4c39-4532-926a-f3dcc70af374\") " pod="openstack/memcached-0" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.361925 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8bdb272-4c39-4532-926a-f3dcc70af374-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f8bdb272-4c39-4532-926a-f3dcc70af374\") " pod="openstack/memcached-0" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.361995 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8bdb272-4c39-4532-926a-f3dcc70af374-config-data\") pod \"memcached-0\" (UID: \"f8bdb272-4c39-4532-926a-f3dcc70af374\") " pod="openstack/memcached-0" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.362024 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bdb272-4c39-4532-926a-f3dcc70af374-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f8bdb272-4c39-4532-926a-f3dcc70af374\") " pod="openstack/memcached-0" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.362077 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f8bdb272-4c39-4532-926a-f3dcc70af374-kolla-config\") pod \"memcached-0\" (UID: \"f8bdb272-4c39-4532-926a-f3dcc70af374\") " 
pod="openstack/memcached-0" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.362116 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d7sk\" (UniqueName: \"kubernetes.io/projected/f8bdb272-4c39-4532-926a-f3dcc70af374-kube-api-access-4d7sk\") pod \"memcached-0\" (UID: \"f8bdb272-4c39-4532-926a-f3dcc70af374\") " pod="openstack/memcached-0" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.364232 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.364345 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.379501 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bdb272-4c39-4532-926a-f3dcc70af374-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f8bdb272-4c39-4532-926a-f3dcc70af374\") " pod="openstack/memcached-0" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.381178 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f8bdb272-4c39-4532-926a-f3dcc70af374-kolla-config\") pod \"memcached-0\" (UID: \"f8bdb272-4c39-4532-926a-f3dcc70af374\") " pod="openstack/memcached-0" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.381368 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8bdb272-4c39-4532-926a-f3dcc70af374-config-data\") pod \"memcached-0\" (UID: \"f8bdb272-4c39-4532-926a-f3dcc70af374\") " pod="openstack/memcached-0" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.382797 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f8bdb272-4c39-4532-926a-f3dcc70af374-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f8bdb272-4c39-4532-926a-f3dcc70af374\") " pod="openstack/memcached-0" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.383630 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d7sk\" (UniqueName: \"kubernetes.io/projected/f8bdb272-4c39-4532-926a-f3dcc70af374-kube-api-access-4d7sk\") pod \"memcached-0\" (UID: \"f8bdb272-4c39-4532-926a-f3dcc70af374\") " pod="openstack/memcached-0" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.507993 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vrcrc" Dec 16 15:13:35 crc kubenswrapper[4775]: I1216 15:13:35.516725 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 16 15:13:36 crc kubenswrapper[4775]: I1216 15:13:36.877208 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 15:13:36 crc kubenswrapper[4775]: I1216 15:13:36.878748 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 15:13:36 crc kubenswrapper[4775]: I1216 15:13:36.882812 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-846p5" Dec 16 15:13:37 crc kubenswrapper[4775]: I1216 15:13:36.890263 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 15:13:37 crc kubenswrapper[4775]: I1216 15:13:37.005396 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmdft\" (UniqueName: \"kubernetes.io/projected/06cbbbf9-ba64-4343-bde4-61db8b81e2d8-kube-api-access-jmdft\") pod \"kube-state-metrics-0\" (UID: \"06cbbbf9-ba64-4343-bde4-61db8b81e2d8\") " pod="openstack/kube-state-metrics-0" Dec 16 15:13:37 crc kubenswrapper[4775]: I1216 15:13:37.106438 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmdft\" (UniqueName: \"kubernetes.io/projected/06cbbbf9-ba64-4343-bde4-61db8b81e2d8-kube-api-access-jmdft\") pod \"kube-state-metrics-0\" (UID: \"06cbbbf9-ba64-4343-bde4-61db8b81e2d8\") " pod="openstack/kube-state-metrics-0" Dec 16 15:13:37 crc kubenswrapper[4775]: I1216 15:13:37.127931 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmdft\" (UniqueName: \"kubernetes.io/projected/06cbbbf9-ba64-4343-bde4-61db8b81e2d8-kube-api-access-jmdft\") pod \"kube-state-metrics-0\" (UID: \"06cbbbf9-ba64-4343-bde4-61db8b81e2d8\") " pod="openstack/kube-state-metrics-0" Dec 16 15:13:37 crc kubenswrapper[4775]: I1216 15:13:37.316949 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.118591 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rkmmt"] Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.120192 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.123730 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-t88d4" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.123791 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.123997 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.153036 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-c5f9m"] Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.155177 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.155588 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-c5f9m"] Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.191033 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rkmmt"] Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.280748 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-var-log\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.280942 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b560f177-aa8d-4722-92bd-4ef2755caab0-combined-ca-bundle\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.281028 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9tht\" (UniqueName: \"kubernetes.io/projected/b560f177-aa8d-4722-92bd-4ef2755caab0-kube-api-access-j9tht\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.281099 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-scripts\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: 
I1216 15:13:41.281130 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b560f177-aa8d-4722-92bd-4ef2755caab0-scripts\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.281158 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-etc-ovs\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.281185 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-var-lib\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.281202 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b560f177-aa8d-4722-92bd-4ef2755caab0-ovn-controller-tls-certs\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.281250 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b560f177-aa8d-4722-92bd-4ef2755caab0-var-run\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.281272 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b560f177-aa8d-4722-92bd-4ef2755caab0-var-log-ovn\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.281496 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b560f177-aa8d-4722-92bd-4ef2755caab0-var-run-ovn\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.281544 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-var-run\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.281594 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vkd9\" (UniqueName: \"kubernetes.io/projected/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-kube-api-access-5vkd9\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.383993 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b560f177-aa8d-4722-92bd-4ef2755caab0-combined-ca-bundle\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.384087 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-j9tht\" (UniqueName: \"kubernetes.io/projected/b560f177-aa8d-4722-92bd-4ef2755caab0-kube-api-access-j9tht\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.384133 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-etc-ovs\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.384156 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-scripts\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.384174 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b560f177-aa8d-4722-92bd-4ef2755caab0-scripts\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.384203 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-var-lib\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.384223 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b560f177-aa8d-4722-92bd-4ef2755caab0-ovn-controller-tls-certs\") pod 
\"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.384262 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b560f177-aa8d-4722-92bd-4ef2755caab0-var-run\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.384291 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b560f177-aa8d-4722-92bd-4ef2755caab0-var-log-ovn\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.384360 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b560f177-aa8d-4722-92bd-4ef2755caab0-var-run-ovn\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.384387 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-var-run\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.384423 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vkd9\" (UniqueName: \"kubernetes.io/projected/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-kube-api-access-5vkd9\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc 
kubenswrapper[4775]: I1216 15:13:41.384450 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-var-log\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.387655 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-scripts\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.390851 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b560f177-aa8d-4722-92bd-4ef2755caab0-ovn-controller-tls-certs\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.391100 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b560f177-aa8d-4722-92bd-4ef2755caab0-scripts\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.392210 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-etc-ovs\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.392258 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-var-log\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.392421 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-var-lib\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.392659 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-var-run\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.392680 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b560f177-aa8d-4722-92bd-4ef2755caab0-var-run-ovn\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.393184 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b560f177-aa8d-4722-92bd-4ef2755caab0-var-run\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.393384 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b560f177-aa8d-4722-92bd-4ef2755caab0-var-log-ovn\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 
15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.395178 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b560f177-aa8d-4722-92bd-4ef2755caab0-combined-ca-bundle\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.405441 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9tht\" (UniqueName: \"kubernetes.io/projected/b560f177-aa8d-4722-92bd-4ef2755caab0-kube-api-access-j9tht\") pod \"ovn-controller-rkmmt\" (UID: \"b560f177-aa8d-4722-92bd-4ef2755caab0\") " pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.409392 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vkd9\" (UniqueName: \"kubernetes.io/projected/a42d9c48-0f56-4f2d-8c54-8baebeca09ea-kube-api-access-5vkd9\") pod \"ovn-controller-ovs-c5f9m\" (UID: \"a42d9c48-0f56-4f2d-8c54-8baebeca09ea\") " pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.448492 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:41 crc kubenswrapper[4775]: I1216 15:13:41.488157 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.577079 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.579129 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.584711 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.584996 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.585056 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-j5xdx" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.585170 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.585265 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.587967 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.725455 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.725511 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.725534 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-config\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.725553 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.725579 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.725637 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.725657 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdpdb\" (UniqueName: \"kubernetes.io/projected/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-kube-api-access-tdpdb\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.725690 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.772657 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.774736 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.777566 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.777611 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.777607 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-9z2fq" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.777930 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.786145 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.827377 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.827439 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdpdb\" (UniqueName: 
\"kubernetes.io/projected/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-kube-api-access-tdpdb\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.827497 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.827536 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.827567 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.827592 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-config\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.827616 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 
15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.827651 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.829264 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.830106 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.830753 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-config\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.830822 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.833997 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.834561 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.834751 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.847168 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdpdb\" (UniqueName: \"kubernetes.io/projected/8b7b212f-2aa6-4fc0-a864-6cd8f1943b71-kube-api-access-tdpdb\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.862803 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.930017 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 
16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.930128 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhsms\" (UniqueName: \"kubernetes.io/projected/e5650e0a-bb07-4cce-872c-772038c2ae56-kube-api-access-dhsms\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.930208 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5650e0a-bb07-4cce-872c-772038c2ae56-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.930355 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5650e0a-bb07-4cce-872c-772038c2ae56-config\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.930394 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5650e0a-bb07-4cce-872c-772038c2ae56-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.930439 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e5650e0a-bb07-4cce-872c-772038c2ae56-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 
15:13:43.930462 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5650e0a-bb07-4cce-872c-772038c2ae56-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.930486 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5650e0a-bb07-4cce-872c-772038c2ae56-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:43 crc kubenswrapper[4775]: I1216 15:13:43.939837 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.032337 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5650e0a-bb07-4cce-872c-772038c2ae56-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.032434 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5650e0a-bb07-4cce-872c-772038c2ae56-config\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.032479 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5650e0a-bb07-4cce-872c-772038c2ae56-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 
16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.032518 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e5650e0a-bb07-4cce-872c-772038c2ae56-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.032546 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5650e0a-bb07-4cce-872c-772038c2ae56-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.032582 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5650e0a-bb07-4cce-872c-772038c2ae56-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.032616 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.032648 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhsms\" (UniqueName: \"kubernetes.io/projected/e5650e0a-bb07-4cce-872c-772038c2ae56-kube-api-access-dhsms\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.033099 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/e5650e0a-bb07-4cce-872c-772038c2ae56-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.033437 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5650e0a-bb07-4cce-872c-772038c2ae56-config\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.033594 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.034454 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5650e0a-bb07-4cce-872c-772038c2ae56-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.037194 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5650e0a-bb07-4cce-872c-772038c2ae56-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.037309 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5650e0a-bb07-4cce-872c-772038c2ae56-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" 
Dec 16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.041049 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5650e0a-bb07-4cce-872c-772038c2ae56-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.054755 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.060185 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhsms\" (UniqueName: \"kubernetes.io/projected/e5650e0a-bb07-4cce-872c-772038c2ae56-kube-api-access-dhsms\") pod \"ovsdbserver-sb-0\" (UID: \"e5650e0a-bb07-4cce-872c-772038c2ae56\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:44 crc kubenswrapper[4775]: I1216 15:13:44.101442 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:47 crc kubenswrapper[4775]: I1216 15:13:47.380100 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:13:47 crc kubenswrapper[4775]: E1216 15:13:47.891300 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 16 15:13:47 crc kubenswrapper[4775]: E1216 15:13:47.892247 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4zdlw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-7npzr_openstack(6565fe67-07cb-4f93-aefd-a8ae5622b79a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:13:47 crc kubenswrapper[4775]: E1216 15:13:47.893600 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-7npzr" podUID="6565fe67-07cb-4f93-aefd-a8ae5622b79a" Dec 16 15:13:47 crc kubenswrapper[4775]: W1216 15:13:47.904176 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79fbce0a_9f2b_4548_b886_de6dfe5ff245.slice/crio-5d33d206f49d875ad84b445bececae3b545b62ccef35a30702aefab81b775935 WatchSource:0}: Error finding container 5d33d206f49d875ad84b445bececae3b545b62ccef35a30702aefab81b775935: Status 404 returned error can't find the container with id 5d33d206f49d875ad84b445bececae3b545b62ccef35a30702aefab81b775935 Dec 16 15:13:47 crc kubenswrapper[4775]: E1216 15:13:47.927017 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 16 15:13:47 crc kubenswrapper[4775]: E1216 15:13:47.927293 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xhq9j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-n4jdv_openstack(36fe89c9-3522-47e3-bd98-9a75fb410abd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:13:47 crc kubenswrapper[4775]: E1216 15:13:47.928542 4775 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-n4jdv" podUID="36fe89c9-3522-47e3-bd98-9a75fb410abd" Dec 16 15:13:48 crc kubenswrapper[4775]: I1216 15:13:48.458615 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 15:13:48 crc kubenswrapper[4775]: W1216 15:13:48.460655 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode109503b_1619_4659_956c_24c58c0011a6.slice/crio-d25bafb2a14bd5f5e6e3ae559f6e347db18a38ffcf0a2f63d6c9d0112ae1300b WatchSource:0}: Error finding container d25bafb2a14bd5f5e6e3ae559f6e347db18a38ffcf0a2f63d6c9d0112ae1300b: Status 404 returned error can't find the container with id d25bafb2a14bd5f5e6e3ae559f6e347db18a38ffcf0a2f63d6c9d0112ae1300b Dec 16 15:13:48 crc kubenswrapper[4775]: I1216 15:13:48.539063 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rkmmt"] Dec 16 15:13:48 crc kubenswrapper[4775]: I1216 15:13:48.544537 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:13:48 crc kubenswrapper[4775]: I1216 15:13:48.548936 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 15:13:48 crc kubenswrapper[4775]: I1216 15:13:48.735809 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2baed48f-c5f4-4126-b0ed-403a38b18c00","Type":"ContainerStarted","Data":"e6874878b4d5af2f1b8b6c1059db08f7eec00c5e19749427628365cd59a7b8e6"} Dec 16 15:13:48 crc kubenswrapper[4775]: I1216 15:13:48.737383 4775 generic.go:334] "Generic (PLEG): container finished" podID="43c5e083-39ac-488d-a68f-699ba4b264cb" containerID="b42bd2c4189538f0ca529cc0da217eaf64d3d2d3399570a407a815d7c9236da2" exitCode=0 Dec 16 15:13:48 
crc kubenswrapper[4775]: I1216 15:13:48.737429 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" event={"ID":"43c5e083-39ac-488d-a68f-699ba4b264cb","Type":"ContainerDied","Data":"b42bd2c4189538f0ca529cc0da217eaf64d3d2d3399570a407a815d7c9236da2"} Dec 16 15:13:48 crc kubenswrapper[4775]: I1216 15:13:48.738983 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e109503b-1619-4659-956c-24c58c0011a6","Type":"ContainerStarted","Data":"d25bafb2a14bd5f5e6e3ae559f6e347db18a38ffcf0a2f63d6c9d0112ae1300b"} Dec 16 15:13:48 crc kubenswrapper[4775]: I1216 15:13:48.740358 4775 generic.go:334] "Generic (PLEG): container finished" podID="06fb4931-8386-4c6f-86c6-2cb5c0a323f0" containerID="c4c31c48bb9cd30ad452a8748b9b30ad2b3cca8757deced584d706b39ded3fc8" exitCode=0 Dec 16 15:13:48 crc kubenswrapper[4775]: I1216 15:13:48.740445 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" event={"ID":"06fb4931-8386-4c6f-86c6-2cb5c0a323f0","Type":"ContainerDied","Data":"c4c31c48bb9cd30ad452a8748b9b30ad2b3cca8757deced584d706b39ded3fc8"} Dec 16 15:13:48 crc kubenswrapper[4775]: I1216 15:13:48.746584 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rkmmt" event={"ID":"b560f177-aa8d-4722-92bd-4ef2755caab0","Type":"ContainerStarted","Data":"627b5a9ba55ab49b0a7394c571f3497716c98fdcc9a5bc03807a7a8a2105e1f2"} Dec 16 15:13:48 crc kubenswrapper[4775]: I1216 15:13:48.749489 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"79fbce0a-9f2b-4548-b886-de6dfe5ff245","Type":"ContainerStarted","Data":"5d33d206f49d875ad84b445bececae3b545b62ccef35a30702aefab81b775935"} Dec 16 15:13:48 crc kubenswrapper[4775]: I1216 15:13:48.750604 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"0451a266-fe64-4e36-93f7-9ebb1e547eec","Type":"ContainerStarted","Data":"5f790c6338d10d60a544ec29df3dc224bbe7565c449092d0782500a3b7b0a9a1"} Dec 16 15:13:48 crc kubenswrapper[4775]: W1216 15:13:48.950565 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8bdb272_4c39_4532_926a_f3dcc70af374.slice/crio-7a716ead9ecd68ce60ea860be8d59f1c1c9fadc97a84665bfcf49fb6415ef3b0 WatchSource:0}: Error finding container 7a716ead9ecd68ce60ea860be8d59f1c1c9fadc97a84665bfcf49fb6415ef3b0: Status 404 returned error can't find the container with id 7a716ead9ecd68ce60ea860be8d59f1c1c9fadc97a84665bfcf49fb6415ef3b0 Dec 16 15:13:48 crc kubenswrapper[4775]: I1216 15:13:48.955375 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 16 15:13:48 crc kubenswrapper[4775]: I1216 15:13:48.964666 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.061052 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 15:13:49 crc kubenswrapper[4775]: W1216 15:13:49.094191 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b7b212f_2aa6_4fc0_a864_6cd8f1943b71.slice/crio-ee78142f21d5627367a3706226a6dbb8002560ff3a068d43b3682e1d3cd84855 WatchSource:0}: Error finding container ee78142f21d5627367a3706226a6dbb8002560ff3a068d43b3682e1d3cd84855: Status 404 returned error can't find the container with id ee78142f21d5627367a3706226a6dbb8002560ff3a068d43b3682e1d3cd84855 Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.158920 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-c5f9m"] Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.226624 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-7npzr" Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.231818 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-n4jdv" Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.354958 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6565fe67-07cb-4f93-aefd-a8ae5622b79a-config\") pod \"6565fe67-07cb-4f93-aefd-a8ae5622b79a\" (UID: \"6565fe67-07cb-4f93-aefd-a8ae5622b79a\") " Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.355054 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36fe89c9-3522-47e3-bd98-9a75fb410abd-config\") pod \"36fe89c9-3522-47e3-bd98-9a75fb410abd\" (UID: \"36fe89c9-3522-47e3-bd98-9a75fb410abd\") " Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.355165 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36fe89c9-3522-47e3-bd98-9a75fb410abd-dns-svc\") pod \"36fe89c9-3522-47e3-bd98-9a75fb410abd\" (UID: \"36fe89c9-3522-47e3-bd98-9a75fb410abd\") " Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.355229 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zdlw\" (UniqueName: \"kubernetes.io/projected/6565fe67-07cb-4f93-aefd-a8ae5622b79a-kube-api-access-4zdlw\") pod \"6565fe67-07cb-4f93-aefd-a8ae5622b79a\" (UID: \"6565fe67-07cb-4f93-aefd-a8ae5622b79a\") " Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.355261 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhq9j\" (UniqueName: \"kubernetes.io/projected/36fe89c9-3522-47e3-bd98-9a75fb410abd-kube-api-access-xhq9j\") pod \"36fe89c9-3522-47e3-bd98-9a75fb410abd\" (UID: 
\"36fe89c9-3522-47e3-bd98-9a75fb410abd\") " Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.355787 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36fe89c9-3522-47e3-bd98-9a75fb410abd-config" (OuterVolumeSpecName: "config") pod "36fe89c9-3522-47e3-bd98-9a75fb410abd" (UID: "36fe89c9-3522-47e3-bd98-9a75fb410abd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.355946 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6565fe67-07cb-4f93-aefd-a8ae5622b79a-config" (OuterVolumeSpecName: "config") pod "6565fe67-07cb-4f93-aefd-a8ae5622b79a" (UID: "6565fe67-07cb-4f93-aefd-a8ae5622b79a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.356221 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36fe89c9-3522-47e3-bd98-9a75fb410abd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "36fe89c9-3522-47e3-bd98-9a75fb410abd" (UID: "36fe89c9-3522-47e3-bd98-9a75fb410abd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.361665 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6565fe67-07cb-4f93-aefd-a8ae5622b79a-kube-api-access-4zdlw" (OuterVolumeSpecName: "kube-api-access-4zdlw") pod "6565fe67-07cb-4f93-aefd-a8ae5622b79a" (UID: "6565fe67-07cb-4f93-aefd-a8ae5622b79a"). InnerVolumeSpecName "kube-api-access-4zdlw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.361725 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36fe89c9-3522-47e3-bd98-9a75fb410abd-kube-api-access-xhq9j" (OuterVolumeSpecName: "kube-api-access-xhq9j") pod "36fe89c9-3522-47e3-bd98-9a75fb410abd" (UID: "36fe89c9-3522-47e3-bd98-9a75fb410abd"). InnerVolumeSpecName "kube-api-access-xhq9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.457076 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zdlw\" (UniqueName: \"kubernetes.io/projected/6565fe67-07cb-4f93-aefd-a8ae5622b79a-kube-api-access-4zdlw\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.457110 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhq9j\" (UniqueName: \"kubernetes.io/projected/36fe89c9-3522-47e3-bd98-9a75fb410abd-kube-api-access-xhq9j\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.457123 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6565fe67-07cb-4f93-aefd-a8ae5622b79a-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.457131 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36fe89c9-3522-47e3-bd98-9a75fb410abd-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.457142 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36fe89c9-3522-47e3-bd98-9a75fb410abd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.760022 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71","Type":"ContainerStarted","Data":"ee78142f21d5627367a3706226a6dbb8002560ff3a068d43b3682e1d3cd84855"} Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.765116 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" event={"ID":"43c5e083-39ac-488d-a68f-699ba4b264cb","Type":"ContainerStarted","Data":"25ddee36ccfbe183230bbd7501a720ba2e998da9c8bedf15e4cbf26eefa929c9"} Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.765247 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.773759 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" event={"ID":"06fb4931-8386-4c6f-86c6-2cb5c0a323f0","Type":"ContainerStarted","Data":"d9863ea985ec9d0dd4a2331755a36d6fec94ef841e316040f21a5123ca2ec3e7"} Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.774222 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.775649 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"06cbbbf9-ba64-4343-bde4-61db8b81e2d8","Type":"ContainerStarted","Data":"ff0fc197cca6d40eb8b1770e4810ff88538fd34242a251cca4d42fb30b468318"} Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.777702 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-n4jdv" Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.777757 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-n4jdv" event={"ID":"36fe89c9-3522-47e3-bd98-9a75fb410abd","Type":"ContainerDied","Data":"9d140db3502c578beeb7a87d60b61f0499b3d772286b1c0922f8ea39289852fc"} Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.780018 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-7npzr" Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.780481 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-7npzr" event={"ID":"6565fe67-07cb-4f93-aefd-a8ae5622b79a","Type":"ContainerDied","Data":"5b9149d39901a66788b326d84649e0084b650b56d4697d82cd38d1e7fc512b0f"} Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.782283 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f8bdb272-4c39-4532-926a-f3dcc70af374","Type":"ContainerStarted","Data":"7a716ead9ecd68ce60ea860be8d59f1c1c9fadc97a84665bfcf49fb6415ef3b0"} Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.783396 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-c5f9m" event={"ID":"a42d9c48-0f56-4f2d-8c54-8baebeca09ea","Type":"ContainerStarted","Data":"6511decc5b79ceb2b1360c021f882692264f995b3084d5d2053046401629310f"} Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.793776 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" podStartSLOduration=3.204627824 podStartE2EDuration="19.793751625s" podCreationTimestamp="2025-12-16 15:13:30 +0000 UTC" firstStartedPulling="2025-12-16 15:13:31.46749159 +0000 UTC m=+1136.418570513" lastFinishedPulling="2025-12-16 15:13:48.056615401 +0000 UTC m=+1153.007694314" observedRunningTime="2025-12-16 
15:13:49.78659072 +0000 UTC m=+1154.737669643" watchObservedRunningTime="2025-12-16 15:13:49.793751625 +0000 UTC m=+1154.744830538" Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.820294 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.823523 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" podStartSLOduration=6.225806729 podStartE2EDuration="19.823500822s" podCreationTimestamp="2025-12-16 15:13:30 +0000 UTC" firstStartedPulling="2025-12-16 15:13:34.464331489 +0000 UTC m=+1139.415410422" lastFinishedPulling="2025-12-16 15:13:48.062025592 +0000 UTC m=+1153.013104515" observedRunningTime="2025-12-16 15:13:49.811255847 +0000 UTC m=+1154.762334770" watchObservedRunningTime="2025-12-16 15:13:49.823500822 +0000 UTC m=+1154.774579745" Dec 16 15:13:49 crc kubenswrapper[4775]: W1216 15:13:49.828436 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5650e0a_bb07_4cce_872c_772038c2ae56.slice/crio-13caba0031e8ba0aa1b33c4bfe1548be5e8f28d47ffe7da33bd553ec2e2855b4 WatchSource:0}: Error finding container 13caba0031e8ba0aa1b33c4bfe1548be5e8f28d47ffe7da33bd553ec2e2855b4: Status 404 returned error can't find the container with id 13caba0031e8ba0aa1b33c4bfe1548be5e8f28d47ffe7da33bd553ec2e2855b4 Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.861482 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7npzr"] Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.862713 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7npzr"] Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 15:13:49.886806 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-n4jdv"] Dec 16 15:13:49 crc kubenswrapper[4775]: I1216 
15:13:49.895429 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-n4jdv"] Dec 16 15:13:50 crc kubenswrapper[4775]: I1216 15:13:50.793590 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e5650e0a-bb07-4cce-872c-772038c2ae56","Type":"ContainerStarted","Data":"13caba0031e8ba0aa1b33c4bfe1548be5e8f28d47ffe7da33bd553ec2e2855b4"} Dec 16 15:13:51 crc kubenswrapper[4775]: I1216 15:13:51.356433 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36fe89c9-3522-47e3-bd98-9a75fb410abd" path="/var/lib/kubelet/pods/36fe89c9-3522-47e3-bd98-9a75fb410abd/volumes" Dec 16 15:13:51 crc kubenswrapper[4775]: I1216 15:13:51.356932 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6565fe67-07cb-4f93-aefd-a8ae5622b79a" path="/var/lib/kubelet/pods/6565fe67-07cb-4f93-aefd-a8ae5622b79a/volumes" Dec 16 15:13:55 crc kubenswrapper[4775]: I1216 15:13:55.870062 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" Dec 16 15:13:56 crc kubenswrapper[4775]: I1216 15:13:56.189604 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" Dec 16 15:13:56 crc kubenswrapper[4775]: I1216 15:13:56.238468 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jdj4n"] Dec 16 15:13:56 crc kubenswrapper[4775]: I1216 15:13:56.840417 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2baed48f-c5f4-4126-b0ed-403a38b18c00","Type":"ContainerStarted","Data":"4a8c3e6c74f87f927e2d61f554b19078b78b285ef625cf8c685063113b711186"} Dec 16 15:13:56 crc kubenswrapper[4775]: I1216 15:13:56.846416 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71","Type":"ContainerStarted","Data":"ab61beed55108dbe636a19374c074a4ef017581f655a53870081d43e812c2c34"} Dec 16 15:13:56 crc kubenswrapper[4775]: I1216 15:13:56.848533 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e109503b-1619-4659-956c-24c58c0011a6","Type":"ContainerStarted","Data":"41f21063b31f4cb96e8e9a377385f4cb760d8052d063aa61399eaad03a8dedd2"} Dec 16 15:13:56 crc kubenswrapper[4775]: I1216 15:13:56.850342 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e5650e0a-bb07-4cce-872c-772038c2ae56","Type":"ContainerStarted","Data":"a4116a6e8761558651a7a9508a86eab4f39338ef18ed1f8d75d6b2afdf557088"} Dec 16 15:13:56 crc kubenswrapper[4775]: I1216 15:13:56.852934 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0451a266-fe64-4e36-93f7-9ebb1e547eec","Type":"ContainerStarted","Data":"290cef72b957bb2ee1c39d2653f4f5bb1e67aa6a9764573ac57356610f089b95"} Dec 16 15:13:56 crc kubenswrapper[4775]: I1216 15:13:56.854757 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f8bdb272-4c39-4532-926a-f3dcc70af374","Type":"ContainerStarted","Data":"63e46def5472380a0729b101030a859b787c33dee948ccbe1d93a425f8e1e5d5"} Dec 16 15:13:56 crc kubenswrapper[4775]: I1216 15:13:56.855343 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 16 15:13:56 crc kubenswrapper[4775]: I1216 15:13:56.857172 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-c5f9m" event={"ID":"a42d9c48-0f56-4f2d-8c54-8baebeca09ea","Type":"ContainerStarted","Data":"8dfbfa5e6cd79aea66e5c6d54c35da2e95459b13f115e16cd1a299f17a751a1a"} Dec 16 15:13:56 crc kubenswrapper[4775]: I1216 15:13:56.857335 4775 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" podUID="43c5e083-39ac-488d-a68f-699ba4b264cb" containerName="dnsmasq-dns" containerID="cri-o://25ddee36ccfbe183230bbd7501a720ba2e998da9c8bedf15e4cbf26eefa929c9" gracePeriod=10 Dec 16 15:13:56 crc kubenswrapper[4775]: I1216 15:13:56.974634 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.195313713 podStartE2EDuration="21.974607332s" podCreationTimestamp="2025-12-16 15:13:35 +0000 UTC" firstStartedPulling="2025-12-16 15:13:48.956803785 +0000 UTC m=+1153.907882708" lastFinishedPulling="2025-12-16 15:13:55.736097404 +0000 UTC m=+1160.687176327" observedRunningTime="2025-12-16 15:13:56.940204567 +0000 UTC m=+1161.891283510" watchObservedRunningTime="2025-12-16 15:13:56.974607332 +0000 UTC m=+1161.925686255" Dec 16 15:13:57 crc kubenswrapper[4775]: I1216 15:13:57.865244 4775 generic.go:334] "Generic (PLEG): container finished" podID="43c5e083-39ac-488d-a68f-699ba4b264cb" containerID="25ddee36ccfbe183230bbd7501a720ba2e998da9c8bedf15e4cbf26eefa929c9" exitCode=0 Dec 16 15:13:57 crc kubenswrapper[4775]: I1216 15:13:57.865440 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" event={"ID":"43c5e083-39ac-488d-a68f-699ba4b264cb","Type":"ContainerDied","Data":"25ddee36ccfbe183230bbd7501a720ba2e998da9c8bedf15e4cbf26eefa929c9"} Dec 16 15:13:57 crc kubenswrapper[4775]: I1216 15:13:57.865621 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" event={"ID":"43c5e083-39ac-488d-a68f-699ba4b264cb","Type":"ContainerDied","Data":"ba6e92140b8ffd6a356e09b2c378defebe39f8fb6ca25745859a6e8390d03789"} Dec 16 15:13:57 crc kubenswrapper[4775]: I1216 15:13:57.865634 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba6e92140b8ffd6a356e09b2c378defebe39f8fb6ca25745859a6e8390d03789" Dec 16 15:13:57 crc kubenswrapper[4775]: I1216 15:13:57.866850 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rkmmt" event={"ID":"b560f177-aa8d-4722-92bd-4ef2755caab0","Type":"ContainerStarted","Data":"f8026897e41f0b3a71f633091be311e85e9b7576cba5ae1d9e52e0a6ca5dbd5f"} Dec 16 15:13:57 crc kubenswrapper[4775]: I1216 15:13:57.867874 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-rkmmt" Dec 16 15:13:57 crc kubenswrapper[4775]: I1216 15:13:57.870225 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"79fbce0a-9f2b-4548-b886-de6dfe5ff245","Type":"ContainerStarted","Data":"790a4a60bbabbe93361bb85ff6f9a1546bd650f527fc5aeb455b91ee31cccce3"} Dec 16 15:13:57 crc kubenswrapper[4775]: I1216 15:13:57.872485 4775 generic.go:334] "Generic (PLEG): container finished" podID="a42d9c48-0f56-4f2d-8c54-8baebeca09ea" containerID="8dfbfa5e6cd79aea66e5c6d54c35da2e95459b13f115e16cd1a299f17a751a1a" exitCode=0 Dec 16 15:13:57 crc kubenswrapper[4775]: I1216 15:13:57.872621 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-c5f9m" event={"ID":"a42d9c48-0f56-4f2d-8c54-8baebeca09ea","Type":"ContainerDied","Data":"8dfbfa5e6cd79aea66e5c6d54c35da2e95459b13f115e16cd1a299f17a751a1a"} Dec 16 15:13:57 crc kubenswrapper[4775]: I1216 15:13:57.889829 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-rkmmt" podStartSLOduration=9.742377075 podStartE2EDuration="16.889583911s" podCreationTimestamp="2025-12-16 15:13:41 +0000 UTC" firstStartedPulling="2025-12-16 15:13:48.578758759 +0000 UTC m=+1153.529837682" lastFinishedPulling="2025-12-16 15:13:55.725965595 +0000 UTC m=+1160.677044518" observedRunningTime="2025-12-16 15:13:57.884636696 +0000 UTC m=+1162.835715619" watchObservedRunningTime="2025-12-16 15:13:57.889583911 +0000 UTC m=+1162.840662834" Dec 16 15:13:57 crc kubenswrapper[4775]: I1216 15:13:57.940039 4775 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.021660 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43c5e083-39ac-488d-a68f-699ba4b264cb-config\") pod \"43c5e083-39ac-488d-a68f-699ba4b264cb\" (UID: \"43c5e083-39ac-488d-a68f-699ba4b264cb\") " Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.022053 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43c5e083-39ac-488d-a68f-699ba4b264cb-dns-svc\") pod \"43c5e083-39ac-488d-a68f-699ba4b264cb\" (UID: \"43c5e083-39ac-488d-a68f-699ba4b264cb\") " Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.022134 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czpqk\" (UniqueName: \"kubernetes.io/projected/43c5e083-39ac-488d-a68f-699ba4b264cb-kube-api-access-czpqk\") pod \"43c5e083-39ac-488d-a68f-699ba4b264cb\" (UID: \"43c5e083-39ac-488d-a68f-699ba4b264cb\") " Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.040458 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43c5e083-39ac-488d-a68f-699ba4b264cb-kube-api-access-czpqk" (OuterVolumeSpecName: "kube-api-access-czpqk") pod "43c5e083-39ac-488d-a68f-699ba4b264cb" (UID: "43c5e083-39ac-488d-a68f-699ba4b264cb"). InnerVolumeSpecName "kube-api-access-czpqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.067389 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43c5e083-39ac-488d-a68f-699ba4b264cb-config" (OuterVolumeSpecName: "config") pod "43c5e083-39ac-488d-a68f-699ba4b264cb" (UID: "43c5e083-39ac-488d-a68f-699ba4b264cb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.072430 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43c5e083-39ac-488d-a68f-699ba4b264cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "43c5e083-39ac-488d-a68f-699ba4b264cb" (UID: "43c5e083-39ac-488d-a68f-699ba4b264cb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.126879 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43c5e083-39ac-488d-a68f-699ba4b264cb-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.126934 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43c5e083-39ac-488d-a68f-699ba4b264cb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.126947 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czpqk\" (UniqueName: \"kubernetes.io/projected/43c5e083-39ac-488d-a68f-699ba4b264cb-kube-api-access-czpqk\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.881310 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-c5f9m" event={"ID":"a42d9c48-0f56-4f2d-8c54-8baebeca09ea","Type":"ContainerStarted","Data":"b9b7356b8e3b7109717bcb11c7117af3519dd5ef1d868182c6955ca41e08acb8"} Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.881640 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-c5f9m" event={"ID":"a42d9c48-0f56-4f2d-8c54-8baebeca09ea","Type":"ContainerStarted","Data":"14090104dd28bb6f71e6c5c7dd65abd266446a990f68e629defaddfcb53b7289"} Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.881659 4775 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.884337 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jdj4n" Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.884336 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"06cbbbf9-ba64-4343-bde4-61db8b81e2d8","Type":"ContainerStarted","Data":"eeeb3bf4d0da24759531d7eab23f781a4c354731192bcc208b01ead87fd9cf8c"} Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.884687 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.919667 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-c5f9m" podStartSLOduration=11.362406438 podStartE2EDuration="17.919635748s" podCreationTimestamp="2025-12-16 15:13:41 +0000 UTC" firstStartedPulling="2025-12-16 15:13:49.18222496 +0000 UTC m=+1154.133303883" lastFinishedPulling="2025-12-16 15:13:55.73945427 +0000 UTC m=+1160.690533193" observedRunningTime="2025-12-16 15:13:58.905644067 +0000 UTC m=+1163.856722990" watchObservedRunningTime="2025-12-16 15:13:58.919635748 +0000 UTC m=+1163.870714671" Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.943402 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.973930524 podStartE2EDuration="22.943382857s" podCreationTimestamp="2025-12-16 15:13:36 +0000 UTC" firstStartedPulling="2025-12-16 15:13:48.985170639 +0000 UTC m=+1153.936249562" lastFinishedPulling="2025-12-16 15:13:57.954622972 +0000 UTC m=+1162.905701895" observedRunningTime="2025-12-16 15:13:58.924404899 +0000 UTC m=+1163.875483832" watchObservedRunningTime="2025-12-16 15:13:58.943382857 +0000 UTC m=+1163.894461770" Dec 16 15:13:58 crc 
kubenswrapper[4775]: I1216 15:13:58.951626 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jdj4n"] Dec 16 15:13:58 crc kubenswrapper[4775]: I1216 15:13:58.957129 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jdj4n"] Dec 16 15:13:59 crc kubenswrapper[4775]: I1216 15:13:59.347139 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43c5e083-39ac-488d-a68f-699ba4b264cb" path="/var/lib/kubelet/pods/43c5e083-39ac-488d-a68f-699ba4b264cb/volumes" Dec 16 15:13:59 crc kubenswrapper[4775]: I1216 15:13:59.891014 4775 generic.go:334] "Generic (PLEG): container finished" podID="2baed48f-c5f4-4126-b0ed-403a38b18c00" containerID="4a8c3e6c74f87f927e2d61f554b19078b78b285ef625cf8c685063113b711186" exitCode=0 Dec 16 15:13:59 crc kubenswrapper[4775]: I1216 15:13:59.891099 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2baed48f-c5f4-4126-b0ed-403a38b18c00","Type":"ContainerDied","Data":"4a8c3e6c74f87f927e2d61f554b19078b78b285ef625cf8c685063113b711186"} Dec 16 15:13:59 crc kubenswrapper[4775]: I1216 15:13:59.895259 4775 generic.go:334] "Generic (PLEG): container finished" podID="e109503b-1619-4659-956c-24c58c0011a6" containerID="41f21063b31f4cb96e8e9a377385f4cb760d8052d063aa61399eaad03a8dedd2" exitCode=0 Dec 16 15:13:59 crc kubenswrapper[4775]: I1216 15:13:59.895369 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e109503b-1619-4659-956c-24c58c0011a6","Type":"ContainerDied","Data":"41f21063b31f4cb96e8e9a377385f4cb760d8052d063aa61399eaad03a8dedd2"} Dec 16 15:13:59 crc kubenswrapper[4775]: I1216 15:13:59.895984 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:14:00 crc kubenswrapper[4775]: I1216 15:14:00.906682 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"e109503b-1619-4659-956c-24c58c0011a6","Type":"ContainerStarted","Data":"85ac0eb1e8fccec4846ccb304f5938cc0491956e4909fc6386fd2cb04b3ef303"} Dec 16 15:14:00 crc kubenswrapper[4775]: I1216 15:14:00.909538 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e5650e0a-bb07-4cce-872c-772038c2ae56","Type":"ContainerStarted","Data":"1a8620e2aed751403abc2c9a4a9773a5f5dd2561329cd3b7b1503539c5864a2f"} Dec 16 15:14:00 crc kubenswrapper[4775]: I1216 15:14:00.912907 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2baed48f-c5f4-4126-b0ed-403a38b18c00","Type":"ContainerStarted","Data":"30917e537d76c38148c27a6654f88c23c9e42eab0f68a8b93721bc6f933d94dc"} Dec 16 15:14:00 crc kubenswrapper[4775]: I1216 15:14:00.917954 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8b7b212f-2aa6-4fc0-a864-6cd8f1943b71","Type":"ContainerStarted","Data":"57f6a9b781e8197a7c0f16e8fd51a4ad1dcfa8d54013fc55742ebf528e741b6c"} Dec 16 15:14:00 crc kubenswrapper[4775]: I1216 15:14:00.938511 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.670851509 podStartE2EDuration="27.938484681s" podCreationTimestamp="2025-12-16 15:13:33 +0000 UTC" firstStartedPulling="2025-12-16 15:13:48.463108264 +0000 UTC m=+1153.414187187" lastFinishedPulling="2025-12-16 15:13:55.730741446 +0000 UTC m=+1160.681820359" observedRunningTime="2025-12-16 15:14:00.929451776 +0000 UTC m=+1165.880530709" watchObservedRunningTime="2025-12-16 15:14:00.938484681 +0000 UTC m=+1165.889563644" Dec 16 15:14:00 crc kubenswrapper[4775]: I1216 15:14:00.958169 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.040968396 podStartE2EDuration="18.958146241s" podCreationTimestamp="2025-12-16 
15:13:42 +0000 UTC" firstStartedPulling="2025-12-16 15:13:49.104038115 +0000 UTC m=+1154.055117038" lastFinishedPulling="2025-12-16 15:14:00.02121596 +0000 UTC m=+1164.972294883" observedRunningTime="2025-12-16 15:14:00.953399252 +0000 UTC m=+1165.904478225" watchObservedRunningTime="2025-12-16 15:14:00.958146241 +0000 UTC m=+1165.909225174" Dec 16 15:14:00 crc kubenswrapper[4775]: I1216 15:14:00.981356 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.655958671 podStartE2EDuration="18.981336312s" podCreationTimestamp="2025-12-16 15:13:42 +0000 UTC" firstStartedPulling="2025-12-16 15:13:49.83230888 +0000 UTC m=+1154.783387803" lastFinishedPulling="2025-12-16 15:14:00.157686521 +0000 UTC m=+1165.108765444" observedRunningTime="2025-12-16 15:14:00.979289368 +0000 UTC m=+1165.930368291" watchObservedRunningTime="2025-12-16 15:14:00.981336312 +0000 UTC m=+1165.932415235" Dec 16 15:14:01 crc kubenswrapper[4775]: I1216 15:14:01.940842 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 16 15:14:01 crc kubenswrapper[4775]: I1216 15:14:01.988216 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 16 15:14:02 crc kubenswrapper[4775]: I1216 15:14:02.012420 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=22.863717578 podStartE2EDuration="30.01240116s" podCreationTimestamp="2025-12-16 15:13:32 +0000 UTC" firstStartedPulling="2025-12-16 15:13:48.582080254 +0000 UTC m=+1153.533159177" lastFinishedPulling="2025-12-16 15:13:55.730763836 +0000 UTC m=+1160.681842759" observedRunningTime="2025-12-16 15:14:00.999288657 +0000 UTC m=+1165.950367590" watchObservedRunningTime="2025-12-16 15:14:02.01240116 +0000 UTC m=+1166.963480083" Dec 16 15:14:02 crc kubenswrapper[4775]: I1216 15:14:02.101623 4775 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 16 15:14:02 crc kubenswrapper[4775]: I1216 15:14:02.136443 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 16 15:14:02 crc kubenswrapper[4775]: I1216 15:14:02.930736 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 16 15:14:02 crc kubenswrapper[4775]: I1216 15:14:02.930803 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 16 15:14:02 crc kubenswrapper[4775]: I1216 15:14:02.966396 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 16 15:14:02 crc kubenswrapper[4775]: I1216 15:14:02.978708 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.176538 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-tc5v9"] Dec 16 15:14:03 crc kubenswrapper[4775]: E1216 15:14:03.177272 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43c5e083-39ac-488d-a68f-699ba4b264cb" containerName="dnsmasq-dns" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.177292 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="43c5e083-39ac-488d-a68f-699ba4b264cb" containerName="dnsmasq-dns" Dec 16 15:14:03 crc kubenswrapper[4775]: E1216 15:14:03.177315 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43c5e083-39ac-488d-a68f-699ba4b264cb" containerName="init" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.177323 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="43c5e083-39ac-488d-a68f-699ba4b264cb" containerName="init" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.177477 4775 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="43c5e083-39ac-488d-a68f-699ba4b264cb" containerName="dnsmasq-dns" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.178319 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-tc5v9" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.180076 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.191922 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-tc5v9"] Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.222320 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-nwdtm"] Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.223374 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.225625 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.228613 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nwdtm"] Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.315327 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-config\") pod \"dnsmasq-dns-7f896c8c65-tc5v9\" (UID: \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\") " pod="openstack/dnsmasq-dns-7f896c8c65-tc5v9" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.315386 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mckm9\" (UniqueName: \"kubernetes.io/projected/096c5279-0aa8-4641-8b5f-66e41869ec98-kube-api-access-mckm9\") pod \"ovn-controller-metrics-nwdtm\" (UID: 
\"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.315414 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/096c5279-0aa8-4641-8b5f-66e41869ec98-combined-ca-bundle\") pod \"ovn-controller-metrics-nwdtm\" (UID: \"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.315438 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/096c5279-0aa8-4641-8b5f-66e41869ec98-ovs-rundir\") pod \"ovn-controller-metrics-nwdtm\" (UID: \"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.315766 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-tc5v9\" (UID: \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\") " pod="openstack/dnsmasq-dns-7f896c8c65-tc5v9" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.315837 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/096c5279-0aa8-4641-8b5f-66e41869ec98-ovn-rundir\") pod \"ovn-controller-metrics-nwdtm\" (UID: \"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.316036 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/096c5279-0aa8-4641-8b5f-66e41869ec98-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nwdtm\" (UID: \"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.316147 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-tc5v9\" (UID: \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\") " pod="openstack/dnsmasq-dns-7f896c8c65-tc5v9" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.316206 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxh85\" (UniqueName: \"kubernetes.io/projected/a2a606e5-65a2-41fa-82ed-a6c785528a8e-kube-api-access-pxh85\") pod \"dnsmasq-dns-7f896c8c65-tc5v9\" (UID: \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\") " pod="openstack/dnsmasq-dns-7f896c8c65-tc5v9" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.316297 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/096c5279-0aa8-4641-8b5f-66e41869ec98-config\") pod \"ovn-controller-metrics-nwdtm\" (UID: \"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.319403 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-tc5v9"] Dec 16 15:14:03 crc kubenswrapper[4775]: E1216 15:14:03.320140 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-pxh85 ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7f896c8c65-tc5v9" podUID="a2a606e5-65a2-41fa-82ed-a6c785528a8e" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 
15:14:03.408703 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-46rrz"] Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.410450 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.418048 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-tc5v9\" (UID: \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\") " pod="openstack/dnsmasq-dns-7f896c8c65-tc5v9" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.418113 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/096c5279-0aa8-4641-8b5f-66e41869ec98-ovn-rundir\") pod \"ovn-controller-metrics-nwdtm\" (UID: \"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.418153 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/096c5279-0aa8-4641-8b5f-66e41869ec98-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nwdtm\" (UID: \"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.418192 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-tc5v9\" (UID: \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\") " pod="openstack/dnsmasq-dns-7f896c8c65-tc5v9" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.418228 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pxh85\" (UniqueName: \"kubernetes.io/projected/a2a606e5-65a2-41fa-82ed-a6c785528a8e-kube-api-access-pxh85\") pod \"dnsmasq-dns-7f896c8c65-tc5v9\" (UID: \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\") " pod="openstack/dnsmasq-dns-7f896c8c65-tc5v9" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.418253 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/096c5279-0aa8-4641-8b5f-66e41869ec98-config\") pod \"ovn-controller-metrics-nwdtm\" (UID: \"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.418259 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.418275 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-config\") pod \"dnsmasq-dns-7f896c8c65-tc5v9\" (UID: \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\") " pod="openstack/dnsmasq-dns-7f896c8c65-tc5v9" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.418595 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mckm9\" (UniqueName: \"kubernetes.io/projected/096c5279-0aa8-4641-8b5f-66e41869ec98-kube-api-access-mckm9\") pod \"ovn-controller-metrics-nwdtm\" (UID: \"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.418659 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/096c5279-0aa8-4641-8b5f-66e41869ec98-combined-ca-bundle\") pod \"ovn-controller-metrics-nwdtm\" (UID: \"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc 
kubenswrapper[4775]: I1216 15:14:03.418722 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/096c5279-0aa8-4641-8b5f-66e41869ec98-ovs-rundir\") pod \"ovn-controller-metrics-nwdtm\" (UID: \"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.419161 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/096c5279-0aa8-4641-8b5f-66e41869ec98-ovs-rundir\") pod \"ovn-controller-metrics-nwdtm\" (UID: \"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.419204 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-config\") pod \"dnsmasq-dns-7f896c8c65-tc5v9\" (UID: \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\") " pod="openstack/dnsmasq-dns-7f896c8c65-tc5v9" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.419974 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-tc5v9\" (UID: \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\") " pod="openstack/dnsmasq-dns-7f896c8c65-tc5v9" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.420660 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/096c5279-0aa8-4641-8b5f-66e41869ec98-ovn-rundir\") pod \"ovn-controller-metrics-nwdtm\" (UID: \"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.420700 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/096c5279-0aa8-4641-8b5f-66e41869ec98-config\") pod \"ovn-controller-metrics-nwdtm\" (UID: \"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.421653 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-tc5v9\" (UID: \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\") " pod="openstack/dnsmasq-dns-7f896c8c65-tc5v9" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.431065 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-46rrz"] Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.440754 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/096c5279-0aa8-4641-8b5f-66e41869ec98-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nwdtm\" (UID: \"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.457064 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/096c5279-0aa8-4641-8b5f-66e41869ec98-combined-ca-bundle\") pod \"ovn-controller-metrics-nwdtm\" (UID: \"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.469841 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.476763 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxh85\" (UniqueName: \"kubernetes.io/projected/a2a606e5-65a2-41fa-82ed-a6c785528a8e-kube-api-access-pxh85\") pod \"dnsmasq-dns-7f896c8c65-tc5v9\" (UID: 
\"a2a606e5-65a2-41fa-82ed-a6c785528a8e\") " pod="openstack/dnsmasq-dns-7f896c8c65-tc5v9" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.488790 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.493762 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mckm9\" (UniqueName: \"kubernetes.io/projected/096c5279-0aa8-4641-8b5f-66e41869ec98-kube-api-access-mckm9\") pod \"ovn-controller-metrics-nwdtm\" (UID: \"096c5279-0aa8-4641-8b5f-66e41869ec98\") " pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.497390 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.497509 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-8tclt" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.497677 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.497828 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.521352 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-46rrz\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.521451 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-config\") pod 
\"dnsmasq-dns-86db49b7ff-46rrz\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.521513 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsds2\" (UniqueName: \"kubernetes.io/projected/57de99ad-acd4-4a4b-9e31-a88c7b417639-kube-api-access-lsds2\") pod \"dnsmasq-dns-86db49b7ff-46rrz\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.521558 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-46rrz\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.521597 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-46rrz\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.533900 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.541468 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nwdtm" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.623863 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/470f973b-96da-437e-a5ce-e53dbadd9276-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.623953 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-config\") pod \"dnsmasq-dns-86db49b7ff-46rrz\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.624003 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsds2\" (UniqueName: \"kubernetes.io/projected/57de99ad-acd4-4a4b-9e31-a88c7b417639-kube-api-access-lsds2\") pod \"dnsmasq-dns-86db49b7ff-46rrz\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.624029 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/470f973b-96da-437e-a5ce-e53dbadd9276-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.624046 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-46rrz\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.624069 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/470f973b-96da-437e-a5ce-e53dbadd9276-scripts\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.624095 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-46rrz\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.624133 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/470f973b-96da-437e-a5ce-e53dbadd9276-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.624150 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/470f973b-96da-437e-a5ce-e53dbadd9276-config\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.624174 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-46rrz\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.624206 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj6gf\" (UniqueName: \"kubernetes.io/projected/470f973b-96da-437e-a5ce-e53dbadd9276-kube-api-access-dj6gf\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.624221 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/470f973b-96da-437e-a5ce-e53dbadd9276-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.625345 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-config\") pod \"dnsmasq-dns-86db49b7ff-46rrz\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.626148 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-46rrz\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.627484 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-46rrz\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.627693 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-46rrz\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.669736 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsds2\" (UniqueName: \"kubernetes.io/projected/57de99ad-acd4-4a4b-9e31-a88c7b417639-kube-api-access-lsds2\") pod \"dnsmasq-dns-86db49b7ff-46rrz\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.725739 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/470f973b-96da-437e-a5ce-e53dbadd9276-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.726030 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/470f973b-96da-437e-a5ce-e53dbadd9276-scripts\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.726109 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/470f973b-96da-437e-a5ce-e53dbadd9276-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.726133 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/470f973b-96da-437e-a5ce-e53dbadd9276-config\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " 
pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.726190 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj6gf\" (UniqueName: \"kubernetes.io/projected/470f973b-96da-437e-a5ce-e53dbadd9276-kube-api-access-dj6gf\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.726212 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/470f973b-96da-437e-a5ce-e53dbadd9276-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.726249 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/470f973b-96da-437e-a5ce-e53dbadd9276-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.727186 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/470f973b-96da-437e-a5ce-e53dbadd9276-scripts\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.727468 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/470f973b-96da-437e-a5ce-e53dbadd9276-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.727488 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/470f973b-96da-437e-a5ce-e53dbadd9276-config\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.730837 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/470f973b-96da-437e-a5ce-e53dbadd9276-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.730872 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/470f973b-96da-437e-a5ce-e53dbadd9276-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.731070 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/470f973b-96da-437e-a5ce-e53dbadd9276-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.744939 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj6gf\" (UniqueName: \"kubernetes.io/projected/470f973b-96da-437e-a5ce-e53dbadd9276-kube-api-access-dj6gf\") pod \"ovn-northd-0\" (UID: \"470f973b-96da-437e-a5ce-e53dbadd9276\") " pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.811984 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.812032 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 
15:14:03.936922 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-tc5v9" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.937569 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.959405 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 15:14:03 crc kubenswrapper[4775]: I1216 15:14:03.960693 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-tc5v9" Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.032126 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-ovsdbserver-sb\") pod \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\" (UID: \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\") " Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.032233 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-config\") pod \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\" (UID: \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\") " Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.032366 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-dns-svc\") pod \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\" (UID: \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\") " Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.032420 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxh85\" (UniqueName: \"kubernetes.io/projected/a2a606e5-65a2-41fa-82ed-a6c785528a8e-kube-api-access-pxh85\") pod 
\"a2a606e5-65a2-41fa-82ed-a6c785528a8e\" (UID: \"a2a606e5-65a2-41fa-82ed-a6c785528a8e\") " Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.035091 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-config" (OuterVolumeSpecName: "config") pod "a2a606e5-65a2-41fa-82ed-a6c785528a8e" (UID: "a2a606e5-65a2-41fa-82ed-a6c785528a8e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.035336 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a2a606e5-65a2-41fa-82ed-a6c785528a8e" (UID: "a2a606e5-65a2-41fa-82ed-a6c785528a8e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.035687 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a2a606e5-65a2-41fa-82ed-a6c785528a8e" (UID: "a2a606e5-65a2-41fa-82ed-a6c785528a8e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.045600 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2a606e5-65a2-41fa-82ed-a6c785528a8e-kube-api-access-pxh85" (OuterVolumeSpecName: "kube-api-access-pxh85") pod "a2a606e5-65a2-41fa-82ed-a6c785528a8e" (UID: "a2a606e5-65a2-41fa-82ed-a6c785528a8e"). InnerVolumeSpecName "kube-api-access-pxh85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.135108 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.135415 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxh85\" (UniqueName: \"kubernetes.io/projected/a2a606e5-65a2-41fa-82ed-a6c785528a8e-kube-api-access-pxh85\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.135553 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.135638 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2a606e5-65a2-41fa-82ed-a6c785528a8e-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.219767 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nwdtm"] Dec 16 15:14:04 crc kubenswrapper[4775]: W1216 15:14:04.229252 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod096c5279_0aa8_4641_8b5f_66e41869ec98.slice/crio-92c19c3e9014807fc303bc4c95a25090e6ee0fa8c75a84c2f240379ca37a9344 WatchSource:0}: Error finding container 92c19c3e9014807fc303bc4c95a25090e6ee0fa8c75a84c2f240379ca37a9344: Status 404 returned error can't find the container with id 92c19c3e9014807fc303bc4c95a25090e6ee0fa8c75a84c2f240379ca37a9344 Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.552430 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-46rrz"] Dec 16 15:14:04 
crc kubenswrapper[4775]: W1216 15:14:04.562561 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57de99ad_acd4_4a4b_9e31_a88c7b417639.slice/crio-20eb93e11d697a88de9c10ac7e17848ce8991d354361179313d53b36d163cb64 WatchSource:0}: Error finding container 20eb93e11d697a88de9c10ac7e17848ce8991d354361179313d53b36d163cb64: Status 404 returned error can't find the container with id 20eb93e11d697a88de9c10ac7e17848ce8991d354361179313d53b36d163cb64 Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.651690 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 15:14:04 crc kubenswrapper[4775]: W1216 15:14:04.659775 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod470f973b_96da_437e_a5ce_e53dbadd9276.slice/crio-05fbc17d3f63b56f9d612e3380c7419065621d7d80282d8f3a720a6ca22f0711 WatchSource:0}: Error finding container 05fbc17d3f63b56f9d612e3380c7419065621d7d80282d8f3a720a6ca22f0711: Status 404 returned error can't find the container with id 05fbc17d3f63b56f9d612e3380c7419065621d7d80282d8f3a720a6ca22f0711 Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.946154 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"470f973b-96da-437e-a5ce-e53dbadd9276","Type":"ContainerStarted","Data":"05fbc17d3f63b56f9d612e3380c7419065621d7d80282d8f3a720a6ca22f0711"} Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.948439 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nwdtm" event={"ID":"096c5279-0aa8-4641-8b5f-66e41869ec98","Type":"ContainerStarted","Data":"59bc9bb72c2ac75cff0a9bf875d25c7012b70ff484fcacb70f42db610d7e82d9"} Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.948466 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nwdtm" 
event={"ID":"096c5279-0aa8-4641-8b5f-66e41869ec98","Type":"ContainerStarted","Data":"92c19c3e9014807fc303bc4c95a25090e6ee0fa8c75a84c2f240379ca37a9344"} Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.955711 4775 generic.go:334] "Generic (PLEG): container finished" podID="57de99ad-acd4-4a4b-9e31-a88c7b417639" containerID="a88d1a23dbe739b41d61ffbf02a7a258f23193f8a039dd58bc82652407e04cd9" exitCode=0 Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.955827 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" event={"ID":"57de99ad-acd4-4a4b-9e31-a88c7b417639","Type":"ContainerDied","Data":"a88d1a23dbe739b41d61ffbf02a7a258f23193f8a039dd58bc82652407e04cd9"} Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.955863 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" event={"ID":"57de99ad-acd4-4a4b-9e31-a88c7b417639","Type":"ContainerStarted","Data":"20eb93e11d697a88de9c10ac7e17848ce8991d354361179313d53b36d163cb64"} Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.956048 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-tc5v9" Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.973104 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.973779 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 16 15:14:04 crc kubenswrapper[4775]: I1216 15:14:04.997253 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-nwdtm" podStartSLOduration=1.997232471 podStartE2EDuration="1.997232471s" podCreationTimestamp="2025-12-16 15:14:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:14:04.974078862 +0000 UTC m=+1169.925157805" watchObservedRunningTime="2025-12-16 15:14:04.997232471 +0000 UTC m=+1169.948311394" Dec 16 15:14:05 crc kubenswrapper[4775]: I1216 15:14:05.025030 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-tc5v9"] Dec 16 15:14:05 crc kubenswrapper[4775]: I1216 15:14:05.035528 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-tc5v9"] Dec 16 15:14:05 crc kubenswrapper[4775]: I1216 15:14:05.347837 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2a606e5-65a2-41fa-82ed-a6c785528a8e" path="/var/lib/kubelet/pods/a2a606e5-65a2-41fa-82ed-a6c785528a8e/volumes" Dec 16 15:14:05 crc kubenswrapper[4775]: I1216 15:14:05.518365 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 16 15:14:05 crc kubenswrapper[4775]: I1216 15:14:05.965053 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" 
event={"ID":"57de99ad-acd4-4a4b-9e31-a88c7b417639","Type":"ContainerStarted","Data":"8677cd2353fec79f0406ca142b889212b521c689e32931778b94c98e48e4499d"} Dec 16 15:14:05 crc kubenswrapper[4775]: I1216 15:14:05.965270 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:05 crc kubenswrapper[4775]: I1216 15:14:05.984483 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" podStartSLOduration=2.984459138 podStartE2EDuration="2.984459138s" podCreationTimestamp="2025-12-16 15:14:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:14:05.982585549 +0000 UTC m=+1170.933664472" watchObservedRunningTime="2025-12-16 15:14:05.984459138 +0000 UTC m=+1170.935538081" Dec 16 15:14:06 crc kubenswrapper[4775]: I1216 15:14:06.975087 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"470f973b-96da-437e-a5ce-e53dbadd9276","Type":"ContainerStarted","Data":"b355a8ae33e4eedb9cd3407bcafa2156939d07f4db4bea9924fa5d3b2e3c7bf0"} Dec 16 15:14:06 crc kubenswrapper[4775]: I1216 15:14:06.975164 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"470f973b-96da-437e-a5ce-e53dbadd9276","Type":"ContainerStarted","Data":"2710dfa6ee11eb2262732f1660915e6425a7c6851c5633c3b517fd6a33910c0a"} Dec 16 15:14:06 crc kubenswrapper[4775]: I1216 15:14:06.975194 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.003553 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.636637975 podStartE2EDuration="4.003526669s" podCreationTimestamp="2025-12-16 15:14:03 +0000 UTC" firstStartedPulling="2025-12-16 15:14:04.662627745 +0000 
UTC m=+1169.613706668" lastFinishedPulling="2025-12-16 15:14:06.029516439 +0000 UTC m=+1170.980595362" observedRunningTime="2025-12-16 15:14:06.997680864 +0000 UTC m=+1171.948759797" watchObservedRunningTime="2025-12-16 15:14:07.003526669 +0000 UTC m=+1171.954605582" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.275209 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-46rrz"] Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.317281 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhnq6"] Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.319058 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.333332 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.396439 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhnq6"] Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.498252 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfvkl\" (UniqueName: \"kubernetes.io/projected/0428ec83-659c-47e6-8b58-385b582e628e-kube-api-access-hfvkl\") pod \"dnsmasq-dns-698758b865-qhnq6\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.498354 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-dns-svc\") pod \"dnsmasq-dns-698758b865-qhnq6\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.498483 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-config\") pod \"dnsmasq-dns-698758b865-qhnq6\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.498547 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-qhnq6\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.498613 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-qhnq6\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.600242 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfvkl\" (UniqueName: \"kubernetes.io/projected/0428ec83-659c-47e6-8b58-385b582e628e-kube-api-access-hfvkl\") pod \"dnsmasq-dns-698758b865-qhnq6\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.600647 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-dns-svc\") pod \"dnsmasq-dns-698758b865-qhnq6\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.600712 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-config\") pod \"dnsmasq-dns-698758b865-qhnq6\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.600751 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-qhnq6\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.600802 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-qhnq6\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.602028 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-config\") pod \"dnsmasq-dns-698758b865-qhnq6\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.602167 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-dns-svc\") pod \"dnsmasq-dns-698758b865-qhnq6\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.602176 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-qhnq6\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.602417 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-qhnq6\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.621910 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfvkl\" (UniqueName: \"kubernetes.io/projected/0428ec83-659c-47e6-8b58-385b582e628e-kube-api-access-hfvkl\") pod \"dnsmasq-dns-698758b865-qhnq6\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.659631 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:07 crc kubenswrapper[4775]: I1216 15:14:07.979402 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" podUID="57de99ad-acd4-4a4b-9e31-a88c7b417639" containerName="dnsmasq-dns" containerID="cri-o://8677cd2353fec79f0406ca142b889212b521c689e32931778b94c98e48e4499d" gracePeriod=10 Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.179819 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhnq6"] Dec 16 15:14:08 crc kubenswrapper[4775]: W1216 15:14:08.197207 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0428ec83_659c_47e6_8b58_385b582e628e.slice/crio-17484b5a54b9b01d9b988858a0cc00b41311073aaee44d03d4da66d1fa1893a5 WatchSource:0}: Error finding container 17484b5a54b9b01d9b988858a0cc00b41311073aaee44d03d4da66d1fa1893a5: Status 404 returned error can't find the container with id 17484b5a54b9b01d9b988858a0cc00b41311073aaee44d03d4da66d1fa1893a5 Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.493484 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.517336 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.528723 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.528972 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.529000 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.530485 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-vkfgp" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.536856 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.633396 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8b23fde4-e483-4825-969c-94ebc8396511-cache\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.633587 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8b23fde4-e483-4825-969c-94ebc8396511-lock\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.633805 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:08 crc kubenswrapper[4775]: 
I1216 15:14:08.633921 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.633967 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f79wj\" (UniqueName: \"kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-kube-api-access-f79wj\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.730092 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-tp2tw"] Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.731155 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.733355 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.733583 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.734174 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.735199 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.735250 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f79wj\" (UniqueName: \"kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-kube-api-access-f79wj\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.735333 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8b23fde4-e483-4825-969c-94ebc8396511-cache\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.735374 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8b23fde4-e483-4825-969c-94ebc8396511-lock\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.735453 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:08 crc kubenswrapper[4775]: E1216 15:14:08.735447 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 15:14:08 crc kubenswrapper[4775]: E1216 15:14:08.735496 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 15:14:08 crc kubenswrapper[4775]: E1216 15:14:08.735574 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift podName:8b23fde4-e483-4825-969c-94ebc8396511 nodeName:}" failed. 
No retries permitted until 2025-12-16 15:14:09.235545231 +0000 UTC m=+1174.186624224 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift") pod "swift-storage-0" (UID: "8b23fde4-e483-4825-969c-94ebc8396511") : configmap "swift-ring-files" not found Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.735877 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/swift-storage-0" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.736152 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8b23fde4-e483-4825-969c-94ebc8396511-lock\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.736162 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8b23fde4-e483-4825-969c-94ebc8396511-cache\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.743705 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tp2tw"] Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.757536 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f79wj\" (UniqueName: \"kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-kube-api-access-f79wj\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 
15:14:08.760014 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.837672 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edd66213-7818-408d-a6ec-73c6e3b39321-scripts\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.837736 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzpp8\" (UniqueName: \"kubernetes.io/projected/edd66213-7818-408d-a6ec-73c6e3b39321-kube-api-access-kzpp8\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.837761 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-dispersionconf\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.837957 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-swiftconf\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.838012 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/edd66213-7818-408d-a6ec-73c6e3b39321-etc-swift\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.838309 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/edd66213-7818-408d-a6ec-73c6e3b39321-ring-data-devices\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.838404 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-combined-ca-bundle\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.939568 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/edd66213-7818-408d-a6ec-73c6e3b39321-ring-data-devices\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.939653 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-combined-ca-bundle\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.939801 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edd66213-7818-408d-a6ec-73c6e3b39321-scripts\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.939828 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-dispersionconf\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.939859 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzpp8\" (UniqueName: \"kubernetes.io/projected/edd66213-7818-408d-a6ec-73c6e3b39321-kube-api-access-kzpp8\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.940001 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-swiftconf\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.940030 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/edd66213-7818-408d-a6ec-73c6e3b39321-etc-swift\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.940545 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/edd66213-7818-408d-a6ec-73c6e3b39321-ring-data-devices\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.940721 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/edd66213-7818-408d-a6ec-73c6e3b39321-etc-swift\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.940900 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edd66213-7818-408d-a6ec-73c6e3b39321-scripts\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.944167 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-swiftconf\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.948916 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-dispersionconf\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.959542 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-combined-ca-bundle\") pod \"swift-ring-rebalance-tp2tw\" (UID: 
\"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.965860 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzpp8\" (UniqueName: \"kubernetes.io/projected/edd66213-7818-408d-a6ec-73c6e3b39321-kube-api-access-kzpp8\") pod \"swift-ring-rebalance-tp2tw\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.990644 4775 generic.go:334] "Generic (PLEG): container finished" podID="57de99ad-acd4-4a4b-9e31-a88c7b417639" containerID="8677cd2353fec79f0406ca142b889212b521c689e32931778b94c98e48e4499d" exitCode=0 Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.990720 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" event={"ID":"57de99ad-acd4-4a4b-9e31-a88c7b417639","Type":"ContainerDied","Data":"8677cd2353fec79f0406ca142b889212b521c689e32931778b94c98e48e4499d"} Dec 16 15:14:08 crc kubenswrapper[4775]: I1216 15:14:08.992554 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qhnq6" event={"ID":"0428ec83-659c-47e6-8b58-385b582e628e","Type":"ContainerStarted","Data":"17484b5a54b9b01d9b988858a0cc00b41311073aaee44d03d4da66d1fa1893a5"} Dec 16 15:14:09 crc kubenswrapper[4775]: I1216 15:14:09.049436 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:09 crc kubenswrapper[4775]: I1216 15:14:09.246759 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:09 crc kubenswrapper[4775]: E1216 15:14:09.247035 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 15:14:09 crc kubenswrapper[4775]: E1216 15:14:09.247054 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 15:14:09 crc kubenswrapper[4775]: E1216 15:14:09.247115 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift podName:8b23fde4-e483-4825-969c-94ebc8396511 nodeName:}" failed. No retries permitted until 2025-12-16 15:14:10.247096714 +0000 UTC m=+1175.198175637 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift") pod "swift-storage-0" (UID: "8b23fde4-e483-4825-969c-94ebc8396511") : configmap "swift-ring-files" not found Dec 16 15:14:09 crc kubenswrapper[4775]: I1216 15:14:09.556754 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tp2tw"] Dec 16 15:14:09 crc kubenswrapper[4775]: W1216 15:14:09.559544 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedd66213_7818_408d_a6ec_73c6e3b39321.slice/crio-f99c8b0fe4db2210e8ae12db5695495c7f39a11ce192119719fa2ec2c1566b68 WatchSource:0}: Error finding container f99c8b0fe4db2210e8ae12db5695495c7f39a11ce192119719fa2ec2c1566b68: Status 404 returned error can't find the container with id f99c8b0fe4db2210e8ae12db5695495c7f39a11ce192119719fa2ec2c1566b68 Dec 16 15:14:10 crc kubenswrapper[4775]: I1216 15:14:10.000493 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tp2tw" event={"ID":"edd66213-7818-408d-a6ec-73c6e3b39321","Type":"ContainerStarted","Data":"f99c8b0fe4db2210e8ae12db5695495c7f39a11ce192119719fa2ec2c1566b68"} Dec 16 15:14:10 crc kubenswrapper[4775]: I1216 15:14:10.267943 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:10 crc kubenswrapper[4775]: E1216 15:14:10.268106 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 15:14:10 crc kubenswrapper[4775]: E1216 15:14:10.268132 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" 
not found Dec 16 15:14:10 crc kubenswrapper[4775]: E1216 15:14:10.268192 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift podName:8b23fde4-e483-4825-969c-94ebc8396511 nodeName:}" failed. No retries permitted until 2025-12-16 15:14:12.268171768 +0000 UTC m=+1177.219250691 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift") pod "swift-storage-0" (UID: "8b23fde4-e483-4825-969c-94ebc8396511") : configmap "swift-ring-files" not found Dec 16 15:14:12 crc kubenswrapper[4775]: I1216 15:14:12.303285 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:12 crc kubenswrapper[4775]: E1216 15:14:12.303558 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 15:14:12 crc kubenswrapper[4775]: E1216 15:14:12.303870 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 15:14:12 crc kubenswrapper[4775]: E1216 15:14:12.304022 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift podName:8b23fde4-e483-4825-969c-94ebc8396511 nodeName:}" failed. No retries permitted until 2025-12-16 15:14:16.303982366 +0000 UTC m=+1181.255061329 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift") pod "swift-storage-0" (UID: "8b23fde4-e483-4825-969c-94ebc8396511") : configmap "swift-ring-files" not found Dec 16 15:14:13 crc kubenswrapper[4775]: I1216 15:14:13.942032 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" podUID="57de99ad-acd4-4a4b-9e31-a88c7b417639" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: connect: connection refused" Dec 16 15:14:16 crc kubenswrapper[4775]: I1216 15:14:16.055969 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qhnq6" event={"ID":"0428ec83-659c-47e6-8b58-385b582e628e","Type":"ContainerStarted","Data":"4f60592c97995273032227a56f3ae07cfcc75f667b62ba0ba41de4fcd53e1ec2"} Dec 16 15:14:16 crc kubenswrapper[4775]: I1216 15:14:16.064519 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 16 15:14:16 crc kubenswrapper[4775]: I1216 15:14:16.214669 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 16 15:14:16 crc kubenswrapper[4775]: I1216 15:14:16.270741 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 16 15:14:16 crc kubenswrapper[4775]: I1216 15:14:16.392178 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 16 15:14:16 crc kubenswrapper[4775]: I1216 15:14:16.395348 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:16 crc kubenswrapper[4775]: E1216 
15:14:16.395679 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 15:14:16 crc kubenswrapper[4775]: E1216 15:14:16.395710 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 15:14:16 crc kubenswrapper[4775]: E1216 15:14:16.395772 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift podName:8b23fde4-e483-4825-969c-94ebc8396511 nodeName:}" failed. No retries permitted until 2025-12-16 15:14:24.395750648 +0000 UTC m=+1189.346829571 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift") pod "swift-storage-0" (UID: "8b23fde4-e483-4825-969c-94ebc8396511") : configmap "swift-ring-files" not found Dec 16 15:14:17 crc kubenswrapper[4775]: I1216 15:14:17.071966 4775 generic.go:334] "Generic (PLEG): container finished" podID="0428ec83-659c-47e6-8b58-385b582e628e" containerID="4f60592c97995273032227a56f3ae07cfcc75f667b62ba0ba41de4fcd53e1ec2" exitCode=0 Dec 16 15:14:17 crc kubenswrapper[4775]: I1216 15:14:17.072038 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qhnq6" event={"ID":"0428ec83-659c-47e6-8b58-385b582e628e","Type":"ContainerDied","Data":"4f60592c97995273032227a56f3ae07cfcc75f667b62ba0ba41de4fcd53e1ec2"} Dec 16 15:14:18 crc kubenswrapper[4775]: I1216 15:14:18.910143 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.046278 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.064683 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-dns-svc\") pod \"57de99ad-acd4-4a4b-9e31-a88c7b417639\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.064733 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-config\") pod \"57de99ad-acd4-4a4b-9e31-a88c7b417639\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.064874 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-ovsdbserver-nb\") pod \"57de99ad-acd4-4a4b-9e31-a88c7b417639\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.064930 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-ovsdbserver-sb\") pod \"57de99ad-acd4-4a4b-9e31-a88c7b417639\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.064987 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsds2\" (UniqueName: \"kubernetes.io/projected/57de99ad-acd4-4a4b-9e31-a88c7b417639-kube-api-access-lsds2\") pod \"57de99ad-acd4-4a4b-9e31-a88c7b417639\" (UID: \"57de99ad-acd4-4a4b-9e31-a88c7b417639\") " Dec 16 
15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.085841 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57de99ad-acd4-4a4b-9e31-a88c7b417639-kube-api-access-lsds2" (OuterVolumeSpecName: "kube-api-access-lsds2") pod "57de99ad-acd4-4a4b-9e31-a88c7b417639" (UID: "57de99ad-acd4-4a4b-9e31-a88c7b417639"). InnerVolumeSpecName "kube-api-access-lsds2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.100433 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tp2tw" event={"ID":"edd66213-7818-408d-a6ec-73c6e3b39321","Type":"ContainerStarted","Data":"7f63d5a6a3c5079741aa599d9a1d9dc2302e27ce63ed41426377e2eb432c8291"} Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.104177 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" event={"ID":"57de99ad-acd4-4a4b-9e31-a88c7b417639","Type":"ContainerDied","Data":"20eb93e11d697a88de9c10ac7e17848ce8991d354361179313d53b36d163cb64"} Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.104242 4775 scope.go:117] "RemoveContainer" containerID="8677cd2353fec79f0406ca142b889212b521c689e32931778b94c98e48e4499d" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.104194 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-46rrz" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.107641 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qhnq6" event={"ID":"0428ec83-659c-47e6-8b58-385b582e628e","Type":"ContainerStarted","Data":"c1ddbf24df25553da481fc30024757f9e28643e73a7df6cc7ab6e64fd7f8faf1"} Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.107826 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.153121 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "57de99ad-acd4-4a4b-9e31-a88c7b417639" (UID: "57de99ad-acd4-4a4b-9e31-a88c7b417639"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.153276 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "57de99ad-acd4-4a4b-9e31-a88c7b417639" (UID: "57de99ad-acd4-4a4b-9e31-a88c7b417639"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.158687 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-config" (OuterVolumeSpecName: "config") pod "57de99ad-acd4-4a4b-9e31-a88c7b417639" (UID: "57de99ad-acd4-4a4b-9e31-a88c7b417639"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.161401 4775 scope.go:117] "RemoveContainer" containerID="a88d1a23dbe739b41d61ffbf02a7a258f23193f8a039dd58bc82652407e04cd9" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.163839 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-tp2tw" podStartSLOduration=1.938065884 podStartE2EDuration="11.163824316s" podCreationTimestamp="2025-12-16 15:14:08 +0000 UTC" firstStartedPulling="2025-12-16 15:14:09.56232696 +0000 UTC m=+1174.513405873" lastFinishedPulling="2025-12-16 15:14:18.788085382 +0000 UTC m=+1183.739164305" observedRunningTime="2025-12-16 15:14:19.125096065 +0000 UTC m=+1184.076175008" watchObservedRunningTime="2025-12-16 15:14:19.163824316 +0000 UTC m=+1184.114903239" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.166642 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.166670 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsds2\" (UniqueName: \"kubernetes.io/projected/57de99ad-acd4-4a4b-9e31-a88c7b417639-kube-api-access-lsds2\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.166681 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.166691 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.168552 4775 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "57de99ad-acd4-4a4b-9e31-a88c7b417639" (UID: "57de99ad-acd4-4a4b-9e31-a88c7b417639"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.169578 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-qhnq6" podStartSLOduration=12.169563207 podStartE2EDuration="12.169563207s" podCreationTimestamp="2025-12-16 15:14:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:14:19.155667379 +0000 UTC m=+1184.106746302" watchObservedRunningTime="2025-12-16 15:14:19.169563207 +0000 UTC m=+1184.120642130" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.268451 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57de99ad-acd4-4a4b-9e31-a88c7b417639-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.430571 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-46rrz"] Dec 16 15:14:19 crc kubenswrapper[4775]: I1216 15:14:19.437880 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-46rrz"] Dec 16 15:14:21 crc kubenswrapper[4775]: I1216 15:14:21.348446 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57de99ad-acd4-4a4b-9e31-a88c7b417639" path="/var/lib/kubelet/pods/57de99ad-acd4-4a4b-9e31-a88c7b417639/volumes" Dec 16 15:14:24 crc kubenswrapper[4775]: I1216 15:14:24.482716 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:24 crc kubenswrapper[4775]: E1216 15:14:24.482965 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 15:14:24 crc kubenswrapper[4775]: E1216 15:14:24.483549 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 15:14:24 crc kubenswrapper[4775]: E1216 15:14:24.483678 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift podName:8b23fde4-e483-4825-969c-94ebc8396511 nodeName:}" failed. No retries permitted until 2025-12-16 15:14:40.483642944 +0000 UTC m=+1205.434721907 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift") pod "swift-storage-0" (UID: "8b23fde4-e483-4825-969c-94ebc8396511") : configmap "swift-ring-files" not found Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.380798 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mxzpt"] Dec 16 15:14:25 crc kubenswrapper[4775]: E1216 15:14:25.381632 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57de99ad-acd4-4a4b-9e31-a88c7b417639" containerName="dnsmasq-dns" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.381717 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="57de99ad-acd4-4a4b-9e31-a88c7b417639" containerName="dnsmasq-dns" Dec 16 15:14:25 crc kubenswrapper[4775]: E1216 15:14:25.381805 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57de99ad-acd4-4a4b-9e31-a88c7b417639" containerName="init" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.381864 
4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="57de99ad-acd4-4a4b-9e31-a88c7b417639" containerName="init" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.382143 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="57de99ad-acd4-4a4b-9e31-a88c7b417639" containerName="dnsmasq-dns" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.382941 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mxzpt" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.392433 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6a2c-account-create-update-l9k8g"] Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.394017 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6a2c-account-create-update-l9k8g" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.410460 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.439278 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6a2c-account-create-update-l9k8g"] Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.454195 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mxzpt"] Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.506950 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41974328-a6d3-4c9f-af51-edd0faa04b5a-operator-scripts\") pod \"keystone-6a2c-account-create-update-l9k8g\" (UID: \"41974328-a6d3-4c9f-af51-edd0faa04b5a\") " pod="openstack/keystone-6a2c-account-create-update-l9k8g" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.507971 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzstv\" 
(UniqueName: \"kubernetes.io/projected/bc656f33-c012-450b-b263-81b4375a3d58-kube-api-access-wzstv\") pod \"keystone-db-create-mxzpt\" (UID: \"bc656f33-c012-450b-b263-81b4375a3d58\") " pod="openstack/keystone-db-create-mxzpt" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.508134 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-997l9\" (UniqueName: \"kubernetes.io/projected/41974328-a6d3-4c9f-af51-edd0faa04b5a-kube-api-access-997l9\") pod \"keystone-6a2c-account-create-update-l9k8g\" (UID: \"41974328-a6d3-4c9f-af51-edd0faa04b5a\") " pod="openstack/keystone-6a2c-account-create-update-l9k8g" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.508273 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc656f33-c012-450b-b263-81b4375a3d58-operator-scripts\") pod \"keystone-db-create-mxzpt\" (UID: \"bc656f33-c012-450b-b263-81b4375a3d58\") " pod="openstack/keystone-db-create-mxzpt" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.540142 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-jfprw"] Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.541735 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-jfprw" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.564367 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jfprw"] Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.611165 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzstv\" (UniqueName: \"kubernetes.io/projected/bc656f33-c012-450b-b263-81b4375a3d58-kube-api-access-wzstv\") pod \"keystone-db-create-mxzpt\" (UID: \"bc656f33-c012-450b-b263-81b4375a3d58\") " pod="openstack/keystone-db-create-mxzpt" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.611270 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-997l9\" (UniqueName: \"kubernetes.io/projected/41974328-a6d3-4c9f-af51-edd0faa04b5a-kube-api-access-997l9\") pod \"keystone-6a2c-account-create-update-l9k8g\" (UID: \"41974328-a6d3-4c9f-af51-edd0faa04b5a\") " pod="openstack/keystone-6a2c-account-create-update-l9k8g" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.611351 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc656f33-c012-450b-b263-81b4375a3d58-operator-scripts\") pod \"keystone-db-create-mxzpt\" (UID: \"bc656f33-c012-450b-b263-81b4375a3d58\") " pod="openstack/keystone-db-create-mxzpt" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.611434 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41974328-a6d3-4c9f-af51-edd0faa04b5a-operator-scripts\") pod \"keystone-6a2c-account-create-update-l9k8g\" (UID: \"41974328-a6d3-4c9f-af51-edd0faa04b5a\") " pod="openstack/keystone-6a2c-account-create-update-l9k8g" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.612516 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41974328-a6d3-4c9f-af51-edd0faa04b5a-operator-scripts\") pod \"keystone-6a2c-account-create-update-l9k8g\" (UID: \"41974328-a6d3-4c9f-af51-edd0faa04b5a\") " pod="openstack/keystone-6a2c-account-create-update-l9k8g" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.613005 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc656f33-c012-450b-b263-81b4375a3d58-operator-scripts\") pod \"keystone-db-create-mxzpt\" (UID: \"bc656f33-c012-450b-b263-81b4375a3d58\") " pod="openstack/keystone-db-create-mxzpt" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.619818 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-4fce-account-create-update-8h6sf"] Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.621358 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4fce-account-create-update-8h6sf" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.624688 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.647778 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzstv\" (UniqueName: \"kubernetes.io/projected/bc656f33-c012-450b-b263-81b4375a3d58-kube-api-access-wzstv\") pod \"keystone-db-create-mxzpt\" (UID: \"bc656f33-c012-450b-b263-81b4375a3d58\") " pod="openstack/keystone-db-create-mxzpt" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.652668 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-997l9\" (UniqueName: \"kubernetes.io/projected/41974328-a6d3-4c9f-af51-edd0faa04b5a-kube-api-access-997l9\") pod \"keystone-6a2c-account-create-update-l9k8g\" (UID: \"41974328-a6d3-4c9f-af51-edd0faa04b5a\") " pod="openstack/keystone-6a2c-account-create-update-l9k8g" 
Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.673818 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4fce-account-create-update-8h6sf"] Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.704457 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-rc6p4"] Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.705432 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mxzpt" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.706463 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rc6p4" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.712107 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-rc6p4"] Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.715545 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzjtl\" (UniqueName: \"kubernetes.io/projected/4078625a-a3c7-45f0-85e1-56c07c7b85b9-kube-api-access-xzjtl\") pod \"placement-db-create-jfprw\" (UID: \"4078625a-a3c7-45f0-85e1-56c07c7b85b9\") " pod="openstack/placement-db-create-jfprw" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.715644 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kxr7\" (UniqueName: \"kubernetes.io/projected/9f018ae8-2c5c-41f5-a7e5-be48d695db30-kube-api-access-6kxr7\") pod \"placement-4fce-account-create-update-8h6sf\" (UID: \"9f018ae8-2c5c-41f5-a7e5-be48d695db30\") " pod="openstack/placement-4fce-account-create-update-8h6sf" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.715734 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4078625a-a3c7-45f0-85e1-56c07c7b85b9-operator-scripts\") pod 
\"placement-db-create-jfprw\" (UID: \"4078625a-a3c7-45f0-85e1-56c07c7b85b9\") " pod="openstack/placement-db-create-jfprw" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.715855 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f018ae8-2c5c-41f5-a7e5-be48d695db30-operator-scripts\") pod \"placement-4fce-account-create-update-8h6sf\" (UID: \"9f018ae8-2c5c-41f5-a7e5-be48d695db30\") " pod="openstack/placement-4fce-account-create-update-8h6sf" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.717423 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6a2c-account-create-update-l9k8g" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.804787 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5251-account-create-update-fmcxk"] Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.806962 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5251-account-create-update-fmcxk" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.817903 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzjtl\" (UniqueName: \"kubernetes.io/projected/4078625a-a3c7-45f0-85e1-56c07c7b85b9-kube-api-access-xzjtl\") pod \"placement-db-create-jfprw\" (UID: \"4078625a-a3c7-45f0-85e1-56c07c7b85b9\") " pod="openstack/placement-db-create-jfprw" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.819262 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kxr7\" (UniqueName: \"kubernetes.io/projected/9f018ae8-2c5c-41f5-a7e5-be48d695db30-kube-api-access-6kxr7\") pod \"placement-4fce-account-create-update-8h6sf\" (UID: \"9f018ae8-2c5c-41f5-a7e5-be48d695db30\") " pod="openstack/placement-4fce-account-create-update-8h6sf" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.819407 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5frw\" (UniqueName: \"kubernetes.io/projected/afe8ec15-4c89-4cf1-a6b7-5e3534660d7b-kube-api-access-g5frw\") pod \"glance-db-create-rc6p4\" (UID: \"afe8ec15-4c89-4cf1-a6b7-5e3534660d7b\") " pod="openstack/glance-db-create-rc6p4" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.819512 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4078625a-a3c7-45f0-85e1-56c07c7b85b9-operator-scripts\") pod \"placement-db-create-jfprw\" (UID: \"4078625a-a3c7-45f0-85e1-56c07c7b85b9\") " pod="openstack/placement-db-create-jfprw" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.822432 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afe8ec15-4c89-4cf1-a6b7-5e3534660d7b-operator-scripts\") pod 
\"glance-db-create-rc6p4\" (UID: \"afe8ec15-4c89-4cf1-a6b7-5e3534660d7b\") " pod="openstack/glance-db-create-rc6p4" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.822578 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f018ae8-2c5c-41f5-a7e5-be48d695db30-operator-scripts\") pod \"placement-4fce-account-create-update-8h6sf\" (UID: \"9f018ae8-2c5c-41f5-a7e5-be48d695db30\") " pod="openstack/placement-4fce-account-create-update-8h6sf" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.823267 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f018ae8-2c5c-41f5-a7e5-be48d695db30-operator-scripts\") pod \"placement-4fce-account-create-update-8h6sf\" (UID: \"9f018ae8-2c5c-41f5-a7e5-be48d695db30\") " pod="openstack/placement-4fce-account-create-update-8h6sf" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.818523 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.822255 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4078625a-a3c7-45f0-85e1-56c07c7b85b9-operator-scripts\") pod \"placement-db-create-jfprw\" (UID: \"4078625a-a3c7-45f0-85e1-56c07c7b85b9\") " pod="openstack/placement-db-create-jfprw" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.827137 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5251-account-create-update-fmcxk"] Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.848978 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzjtl\" (UniqueName: \"kubernetes.io/projected/4078625a-a3c7-45f0-85e1-56c07c7b85b9-kube-api-access-xzjtl\") pod \"placement-db-create-jfprw\" (UID: 
\"4078625a-a3c7-45f0-85e1-56c07c7b85b9\") " pod="openstack/placement-db-create-jfprw" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.853255 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kxr7\" (UniqueName: \"kubernetes.io/projected/9f018ae8-2c5c-41f5-a7e5-be48d695db30-kube-api-access-6kxr7\") pod \"placement-4fce-account-create-update-8h6sf\" (UID: \"9f018ae8-2c5c-41f5-a7e5-be48d695db30\") " pod="openstack/placement-4fce-account-create-update-8h6sf" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.858137 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jfprw" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.924867 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/735a5a7b-bafc-467b-9fa0-9a6755f9f04c-operator-scripts\") pod \"glance-5251-account-create-update-fmcxk\" (UID: \"735a5a7b-bafc-467b-9fa0-9a6755f9f04c\") " pod="openstack/glance-5251-account-create-update-fmcxk" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.924952 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5frw\" (UniqueName: \"kubernetes.io/projected/afe8ec15-4c89-4cf1-a6b7-5e3534660d7b-kube-api-access-g5frw\") pod \"glance-db-create-rc6p4\" (UID: \"afe8ec15-4c89-4cf1-a6b7-5e3534660d7b\") " pod="openstack/glance-db-create-rc6p4" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.925037 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd8sd\" (UniqueName: \"kubernetes.io/projected/735a5a7b-bafc-467b-9fa0-9a6755f9f04c-kube-api-access-zd8sd\") pod \"glance-5251-account-create-update-fmcxk\" (UID: \"735a5a7b-bafc-467b-9fa0-9a6755f9f04c\") " pod="openstack/glance-5251-account-create-update-fmcxk" Dec 16 15:14:25 crc kubenswrapper[4775]: 
I1216 15:14:25.925077 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afe8ec15-4c89-4cf1-a6b7-5e3534660d7b-operator-scripts\") pod \"glance-db-create-rc6p4\" (UID: \"afe8ec15-4c89-4cf1-a6b7-5e3534660d7b\") " pod="openstack/glance-db-create-rc6p4" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.925774 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afe8ec15-4c89-4cf1-a6b7-5e3534660d7b-operator-scripts\") pod \"glance-db-create-rc6p4\" (UID: \"afe8ec15-4c89-4cf1-a6b7-5e3534660d7b\") " pod="openstack/glance-db-create-rc6p4" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.944256 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4fce-account-create-update-8h6sf" Dec 16 15:14:25 crc kubenswrapper[4775]: I1216 15:14:25.950337 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5frw\" (UniqueName: \"kubernetes.io/projected/afe8ec15-4c89-4cf1-a6b7-5e3534660d7b-kube-api-access-g5frw\") pod \"glance-db-create-rc6p4\" (UID: \"afe8ec15-4c89-4cf1-a6b7-5e3534660d7b\") " pod="openstack/glance-db-create-rc6p4" Dec 16 15:14:26 crc kubenswrapper[4775]: I1216 15:14:26.029617 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/735a5a7b-bafc-467b-9fa0-9a6755f9f04c-operator-scripts\") pod \"glance-5251-account-create-update-fmcxk\" (UID: \"735a5a7b-bafc-467b-9fa0-9a6755f9f04c\") " pod="openstack/glance-5251-account-create-update-fmcxk" Dec 16 15:14:26 crc kubenswrapper[4775]: I1216 15:14:26.030188 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd8sd\" (UniqueName: \"kubernetes.io/projected/735a5a7b-bafc-467b-9fa0-9a6755f9f04c-kube-api-access-zd8sd\") pod 
\"glance-5251-account-create-update-fmcxk\" (UID: \"735a5a7b-bafc-467b-9fa0-9a6755f9f04c\") " pod="openstack/glance-5251-account-create-update-fmcxk" Dec 16 15:14:26 crc kubenswrapper[4775]: I1216 15:14:26.030865 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/735a5a7b-bafc-467b-9fa0-9a6755f9f04c-operator-scripts\") pod \"glance-5251-account-create-update-fmcxk\" (UID: \"735a5a7b-bafc-467b-9fa0-9a6755f9f04c\") " pod="openstack/glance-5251-account-create-update-fmcxk" Dec 16 15:14:26 crc kubenswrapper[4775]: I1216 15:14:26.054858 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd8sd\" (UniqueName: \"kubernetes.io/projected/735a5a7b-bafc-467b-9fa0-9a6755f9f04c-kube-api-access-zd8sd\") pod \"glance-5251-account-create-update-fmcxk\" (UID: \"735a5a7b-bafc-467b-9fa0-9a6755f9f04c\") " pod="openstack/glance-5251-account-create-update-fmcxk" Dec 16 15:14:26 crc kubenswrapper[4775]: I1216 15:14:26.108781 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rc6p4" Dec 16 15:14:26 crc kubenswrapper[4775]: I1216 15:14:26.150395 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5251-account-create-update-fmcxk" Dec 16 15:14:26 crc kubenswrapper[4775]: I1216 15:14:26.318645 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mxzpt"] Dec 16 15:14:26 crc kubenswrapper[4775]: I1216 15:14:26.326607 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6a2c-account-create-update-l9k8g"] Dec 16 15:14:26 crc kubenswrapper[4775]: W1216 15:14:26.341824 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41974328_a6d3_4c9f_af51_edd0faa04b5a.slice/crio-715c2f60984e92b2aef232807b2ed9a4ed2eed2b794046c967ac13c6d1454206 WatchSource:0}: Error finding container 715c2f60984e92b2aef232807b2ed9a4ed2eed2b794046c967ac13c6d1454206: Status 404 returned error can't find the container with id 715c2f60984e92b2aef232807b2ed9a4ed2eed2b794046c967ac13c6d1454206 Dec 16 15:14:26 crc kubenswrapper[4775]: I1216 15:14:26.418728 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4fce-account-create-update-8h6sf"] Dec 16 15:14:26 crc kubenswrapper[4775]: I1216 15:14:26.434336 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jfprw"] Dec 16 15:14:26 crc kubenswrapper[4775]: W1216 15:14:26.494433 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f018ae8_2c5c_41f5_a7e5_be48d695db30.slice/crio-5aa35e6e4f6302f44b645a84b3b15e87275e08b99f08ce4830d9f8755daa95c1 WatchSource:0}: Error finding container 5aa35e6e4f6302f44b645a84b3b15e87275e08b99f08ce4830d9f8755daa95c1: Status 404 returned error can't find the container with id 5aa35e6e4f6302f44b645a84b3b15e87275e08b99f08ce4830d9f8755daa95c1 Dec 16 15:14:26 crc kubenswrapper[4775]: I1216 15:14:26.741090 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-rc6p4"] 
Dec 16 15:14:26 crc kubenswrapper[4775]: W1216 15:14:26.745147 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafe8ec15_4c89_4cf1_a6b7_5e3534660d7b.slice/crio-6367fa6c73972162901c034f779d26124dac048e835fba085ba7b8344e0b6de3 WatchSource:0}: Error finding container 6367fa6c73972162901c034f779d26124dac048e835fba085ba7b8344e0b6de3: Status 404 returned error can't find the container with id 6367fa6c73972162901c034f779d26124dac048e835fba085ba7b8344e0b6de3 Dec 16 15:14:26 crc kubenswrapper[4775]: I1216 15:14:26.806997 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5251-account-create-update-fmcxk"] Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.180338 4775 generic.go:334] "Generic (PLEG): container finished" podID="edd66213-7818-408d-a6ec-73c6e3b39321" containerID="7f63d5a6a3c5079741aa599d9a1d9dc2302e27ce63ed41426377e2eb432c8291" exitCode=0 Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.180406 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tp2tw" event={"ID":"edd66213-7818-408d-a6ec-73c6e3b39321","Type":"ContainerDied","Data":"7f63d5a6a3c5079741aa599d9a1d9dc2302e27ce63ed41426377e2eb432c8291"} Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.187278 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5251-account-create-update-fmcxk" event={"ID":"735a5a7b-bafc-467b-9fa0-9a6755f9f04c","Type":"ContainerStarted","Data":"286b43cff16d3ab2f58f13dcd2341ba27cf3c8975a0ae8f7c56abf733c84da61"} Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.190603 4775 generic.go:334] "Generic (PLEG): container finished" podID="4078625a-a3c7-45f0-85e1-56c07c7b85b9" containerID="c26c455a750f5de4ed93a47c05e72cb5218308a1eb988aeafcaaf08b9a362479" exitCode=0 Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.190659 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-create-jfprw" event={"ID":"4078625a-a3c7-45f0-85e1-56c07c7b85b9","Type":"ContainerDied","Data":"c26c455a750f5de4ed93a47c05e72cb5218308a1eb988aeafcaaf08b9a362479"} Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.190721 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jfprw" event={"ID":"4078625a-a3c7-45f0-85e1-56c07c7b85b9","Type":"ContainerStarted","Data":"5bfa81b30a338622936e3978984b700e9efa11d219a52362380d4ee36ce23cf6"} Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.193192 4775 generic.go:334] "Generic (PLEG): container finished" podID="41974328-a6d3-4c9f-af51-edd0faa04b5a" containerID="09e2649ac93ec514357db06eaa4b993541638b063d5c2d1c90e62105d892ccfe" exitCode=0 Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.193233 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6a2c-account-create-update-l9k8g" event={"ID":"41974328-a6d3-4c9f-af51-edd0faa04b5a","Type":"ContainerDied","Data":"09e2649ac93ec514357db06eaa4b993541638b063d5c2d1c90e62105d892ccfe"} Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.193265 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6a2c-account-create-update-l9k8g" event={"ID":"41974328-a6d3-4c9f-af51-edd0faa04b5a","Type":"ContainerStarted","Data":"715c2f60984e92b2aef232807b2ed9a4ed2eed2b794046c967ac13c6d1454206"} Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.195803 4775 generic.go:334] "Generic (PLEG): container finished" podID="bc656f33-c012-450b-b263-81b4375a3d58" containerID="91014cd1d19cc2aa60b3956d63e5e9bf85784cf5d2b17449a249b98dfccb5c5f" exitCode=0 Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.195874 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mxzpt" event={"ID":"bc656f33-c012-450b-b263-81b4375a3d58","Type":"ContainerDied","Data":"91014cd1d19cc2aa60b3956d63e5e9bf85784cf5d2b17449a249b98dfccb5c5f"} Dec 16 15:14:27 crc 
kubenswrapper[4775]: I1216 15:14:27.195927 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mxzpt" event={"ID":"bc656f33-c012-450b-b263-81b4375a3d58","Type":"ContainerStarted","Data":"885abbc8c6957115ab9ae65eaabe1990624805339f447fa62b7a62952ebe672f"} Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.197996 4775 generic.go:334] "Generic (PLEG): container finished" podID="9f018ae8-2c5c-41f5-a7e5-be48d695db30" containerID="f949ba9f03f7fdcf9c5179f6be2903cc3b7f7a73b24900acca12da063e1390f0" exitCode=0 Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.198097 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4fce-account-create-update-8h6sf" event={"ID":"9f018ae8-2c5c-41f5-a7e5-be48d695db30","Type":"ContainerDied","Data":"f949ba9f03f7fdcf9c5179f6be2903cc3b7f7a73b24900acca12da063e1390f0"} Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.198134 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4fce-account-create-update-8h6sf" event={"ID":"9f018ae8-2c5c-41f5-a7e5-be48d695db30","Type":"ContainerStarted","Data":"5aa35e6e4f6302f44b645a84b3b15e87275e08b99f08ce4830d9f8755daa95c1"} Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.199624 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rc6p4" event={"ID":"afe8ec15-4c89-4cf1-a6b7-5e3534660d7b","Type":"ContainerStarted","Data":"6367fa6c73972162901c034f779d26124dac048e835fba085ba7b8344e0b6de3"} Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.661135 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.727228 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s9zj2"] Dec 16 15:14:27 crc kubenswrapper[4775]: I1216 15:14:27.727496 4775 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" podUID="06fb4931-8386-4c6f-86c6-2cb5c0a323f0" containerName="dnsmasq-dns" containerID="cri-o://d9863ea985ec9d0dd4a2331755a36d6fec94ef841e316040f21a5123ca2ec3e7" gracePeriod=10 Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.211509 4775 generic.go:334] "Generic (PLEG): container finished" podID="afe8ec15-4c89-4cf1-a6b7-5e3534660d7b" containerID="1cfc513e3134d62bc62418cd1ff86ffb049bdbd4b1c16aae98aaf777cf74bada" exitCode=0 Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.211580 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rc6p4" event={"ID":"afe8ec15-4c89-4cf1-a6b7-5e3534660d7b","Type":"ContainerDied","Data":"1cfc513e3134d62bc62418cd1ff86ffb049bdbd4b1c16aae98aaf777cf74bada"} Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.213506 4775 generic.go:334] "Generic (PLEG): container finished" podID="735a5a7b-bafc-467b-9fa0-9a6755f9f04c" containerID="b425b98f1568c51e1d08a64b8b7618011cd885d6be9f44f576b060412bc5caac" exitCode=0 Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.213552 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5251-account-create-update-fmcxk" event={"ID":"735a5a7b-bafc-467b-9fa0-9a6755f9f04c","Type":"ContainerDied","Data":"b425b98f1568c51e1d08a64b8b7618011cd885d6be9f44f576b060412bc5caac"} Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.222548 4775 generic.go:334] "Generic (PLEG): container finished" podID="06fb4931-8386-4c6f-86c6-2cb5c0a323f0" containerID="d9863ea985ec9d0dd4a2331755a36d6fec94ef841e316040f21a5123ca2ec3e7" exitCode=0 Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.222645 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" event={"ID":"06fb4931-8386-4c6f-86c6-2cb5c0a323f0","Type":"ContainerDied","Data":"d9863ea985ec9d0dd4a2331755a36d6fec94ef841e316040f21a5123ca2ec3e7"} Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.222719 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" event={"ID":"06fb4931-8386-4c6f-86c6-2cb5c0a323f0","Type":"ContainerDied","Data":"0d0603c7b3e92f7d6a1f3267dd76a4ea243eadf37770d89126b2436054d0f139"} Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.222739 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d0603c7b3e92f7d6a1f3267dd76a4ea243eadf37770d89126b2436054d0f139" Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.329462 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.417779 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ssbl\" (UniqueName: \"kubernetes.io/projected/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-kube-api-access-9ssbl\") pod \"06fb4931-8386-4c6f-86c6-2cb5c0a323f0\" (UID: \"06fb4931-8386-4c6f-86c6-2cb5c0a323f0\") " Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.417996 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-config\") pod \"06fb4931-8386-4c6f-86c6-2cb5c0a323f0\" (UID: \"06fb4931-8386-4c6f-86c6-2cb5c0a323f0\") " Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.418045 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-dns-svc\") pod \"06fb4931-8386-4c6f-86c6-2cb5c0a323f0\" (UID: \"06fb4931-8386-4c6f-86c6-2cb5c0a323f0\") " Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.427603 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-kube-api-access-9ssbl" (OuterVolumeSpecName: "kube-api-access-9ssbl") pod 
"06fb4931-8386-4c6f-86c6-2cb5c0a323f0" (UID: "06fb4931-8386-4c6f-86c6-2cb5c0a323f0"). InnerVolumeSpecName "kube-api-access-9ssbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.472993 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-config" (OuterVolumeSpecName: "config") pod "06fb4931-8386-4c6f-86c6-2cb5c0a323f0" (UID: "06fb4931-8386-4c6f-86c6-2cb5c0a323f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.497104 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06fb4931-8386-4c6f-86c6-2cb5c0a323f0" (UID: "06fb4931-8386-4c6f-86c6-2cb5c0a323f0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.520427 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.520474 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ssbl\" (UniqueName: \"kubernetes.io/projected/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-kube-api-access-9ssbl\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.520489 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06fb4931-8386-4c6f-86c6-2cb5c0a323f0-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.807605 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-4fce-account-create-update-8h6sf" Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.927552 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kxr7\" (UniqueName: \"kubernetes.io/projected/9f018ae8-2c5c-41f5-a7e5-be48d695db30-kube-api-access-6kxr7\") pod \"9f018ae8-2c5c-41f5-a7e5-be48d695db30\" (UID: \"9f018ae8-2c5c-41f5-a7e5-be48d695db30\") " Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.927659 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f018ae8-2c5c-41f5-a7e5-be48d695db30-operator-scripts\") pod \"9f018ae8-2c5c-41f5-a7e5-be48d695db30\" (UID: \"9f018ae8-2c5c-41f5-a7e5-be48d695db30\") " Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.928828 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f018ae8-2c5c-41f5-a7e5-be48d695db30-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f018ae8-2c5c-41f5-a7e5-be48d695db30" (UID: "9f018ae8-2c5c-41f5-a7e5-be48d695db30"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:28 crc kubenswrapper[4775]: I1216 15:14:28.933568 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f018ae8-2c5c-41f5-a7e5-be48d695db30-kube-api-access-6kxr7" (OuterVolumeSpecName: "kube-api-access-6kxr7") pod "9f018ae8-2c5c-41f5-a7e5-be48d695db30" (UID: "9f018ae8-2c5c-41f5-a7e5-be48d695db30"). InnerVolumeSpecName "kube-api-access-6kxr7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.029969 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kxr7\" (UniqueName: \"kubernetes.io/projected/9f018ae8-2c5c-41f5-a7e5-be48d695db30-kube-api-access-6kxr7\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.030487 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f018ae8-2c5c-41f5-a7e5-be48d695db30-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.080963 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mxzpt" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.089402 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.107184 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6a2c-account-create-update-l9k8g" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.110172 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-jfprw" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.132149 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41974328-a6d3-4c9f-af51-edd0faa04b5a-operator-scripts\") pod \"41974328-a6d3-4c9f-af51-edd0faa04b5a\" (UID: \"41974328-a6d3-4c9f-af51-edd0faa04b5a\") " Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.132216 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edd66213-7818-408d-a6ec-73c6e3b39321-scripts\") pod \"edd66213-7818-408d-a6ec-73c6e3b39321\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.132241 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/edd66213-7818-408d-a6ec-73c6e3b39321-ring-data-devices\") pod \"edd66213-7818-408d-a6ec-73c6e3b39321\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.132372 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-combined-ca-bundle\") pod \"edd66213-7818-408d-a6ec-73c6e3b39321\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.132404 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-997l9\" (UniqueName: \"kubernetes.io/projected/41974328-a6d3-4c9f-af51-edd0faa04b5a-kube-api-access-997l9\") pod \"41974328-a6d3-4c9f-af51-edd0faa04b5a\" (UID: \"41974328-a6d3-4c9f-af51-edd0faa04b5a\") " Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.132456 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wzstv\" (UniqueName: \"kubernetes.io/projected/bc656f33-c012-450b-b263-81b4375a3d58-kube-api-access-wzstv\") pod \"bc656f33-c012-450b-b263-81b4375a3d58\" (UID: \"bc656f33-c012-450b-b263-81b4375a3d58\") " Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.132533 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-dispersionconf\") pod \"edd66213-7818-408d-a6ec-73c6e3b39321\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.132555 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc656f33-c012-450b-b263-81b4375a3d58-operator-scripts\") pod \"bc656f33-c012-450b-b263-81b4375a3d58\" (UID: \"bc656f33-c012-450b-b263-81b4375a3d58\") " Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.132580 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/edd66213-7818-408d-a6ec-73c6e3b39321-etc-swift\") pod \"edd66213-7818-408d-a6ec-73c6e3b39321\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.132644 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzpp8\" (UniqueName: \"kubernetes.io/projected/edd66213-7818-408d-a6ec-73c6e3b39321-kube-api-access-kzpp8\") pod \"edd66213-7818-408d-a6ec-73c6e3b39321\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.132704 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-swiftconf\") pod \"edd66213-7818-408d-a6ec-73c6e3b39321\" (UID: \"edd66213-7818-408d-a6ec-73c6e3b39321\") " Dec 16 15:14:29 
crc kubenswrapper[4775]: I1216 15:14:29.138079 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edd66213-7818-408d-a6ec-73c6e3b39321-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "edd66213-7818-408d-a6ec-73c6e3b39321" (UID: "edd66213-7818-408d-a6ec-73c6e3b39321"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.138110 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc656f33-c012-450b-b263-81b4375a3d58-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bc656f33-c012-450b-b263-81b4375a3d58" (UID: "bc656f33-c012-450b-b263-81b4375a3d58"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.139020 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41974328-a6d3-4c9f-af51-edd0faa04b5a-kube-api-access-997l9" (OuterVolumeSpecName: "kube-api-access-997l9") pod "41974328-a6d3-4c9f-af51-edd0faa04b5a" (UID: "41974328-a6d3-4c9f-af51-edd0faa04b5a"). InnerVolumeSpecName "kube-api-access-997l9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.139506 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41974328-a6d3-4c9f-af51-edd0faa04b5a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41974328-a6d3-4c9f-af51-edd0faa04b5a" (UID: "41974328-a6d3-4c9f-af51-edd0faa04b5a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.140376 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edd66213-7818-408d-a6ec-73c6e3b39321-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "edd66213-7818-408d-a6ec-73c6e3b39321" (UID: "edd66213-7818-408d-a6ec-73c6e3b39321"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.143079 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc656f33-c012-450b-b263-81b4375a3d58-kube-api-access-wzstv" (OuterVolumeSpecName: "kube-api-access-wzstv") pod "bc656f33-c012-450b-b263-81b4375a3d58" (UID: "bc656f33-c012-450b-b263-81b4375a3d58"). InnerVolumeSpecName "kube-api-access-wzstv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.143334 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd66213-7818-408d-a6ec-73c6e3b39321-kube-api-access-kzpp8" (OuterVolumeSpecName: "kube-api-access-kzpp8") pod "edd66213-7818-408d-a6ec-73c6e3b39321" (UID: "edd66213-7818-408d-a6ec-73c6e3b39321"). InnerVolumeSpecName "kube-api-access-kzpp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.170377 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "edd66213-7818-408d-a6ec-73c6e3b39321" (UID: "edd66213-7818-408d-a6ec-73c6e3b39321"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.179869 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edd66213-7818-408d-a6ec-73c6e3b39321-scripts" (OuterVolumeSpecName: "scripts") pod "edd66213-7818-408d-a6ec-73c6e3b39321" (UID: "edd66213-7818-408d-a6ec-73c6e3b39321"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.187719 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edd66213-7818-408d-a6ec-73c6e3b39321" (UID: "edd66213-7818-408d-a6ec-73c6e3b39321"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.188572 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "edd66213-7818-408d-a6ec-73c6e3b39321" (UID: "edd66213-7818-408d-a6ec-73c6e3b39321"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.234869 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzjtl\" (UniqueName: \"kubernetes.io/projected/4078625a-a3c7-45f0-85e1-56c07c7b85b9-kube-api-access-xzjtl\") pod \"4078625a-a3c7-45f0-85e1-56c07c7b85b9\" (UID: \"4078625a-a3c7-45f0-85e1-56c07c7b85b9\") " Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.234975 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4078625a-a3c7-45f0-85e1-56c07c7b85b9-operator-scripts\") pod \"4078625a-a3c7-45f0-85e1-56c07c7b85b9\" (UID: \"4078625a-a3c7-45f0-85e1-56c07c7b85b9\") " Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.235624 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzstv\" (UniqueName: \"kubernetes.io/projected/bc656f33-c012-450b-b263-81b4375a3d58-kube-api-access-wzstv\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.235725 4775 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.235740 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc656f33-c012-450b-b263-81b4375a3d58-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.235748 4775 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/edd66213-7818-408d-a6ec-73c6e3b39321-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.235758 4775 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-kzpp8\" (UniqueName: \"kubernetes.io/projected/edd66213-7818-408d-a6ec-73c6e3b39321-kube-api-access-kzpp8\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.235784 4775 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.235795 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41974328-a6d3-4c9f-af51-edd0faa04b5a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.235821 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edd66213-7818-408d-a6ec-73c6e3b39321-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.235845 4775 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/edd66213-7818-408d-a6ec-73c6e3b39321-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.235856 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd66213-7818-408d-a6ec-73c6e3b39321-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.235869 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-997l9\" (UniqueName: \"kubernetes.io/projected/41974328-a6d3-4c9f-af51-edd0faa04b5a-kube-api-access-997l9\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.236366 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4078625a-a3c7-45f0-85e1-56c07c7b85b9-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "4078625a-a3c7-45f0-85e1-56c07c7b85b9" (UID: "4078625a-a3c7-45f0-85e1-56c07c7b85b9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.240460 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4078625a-a3c7-45f0-85e1-56c07c7b85b9-kube-api-access-xzjtl" (OuterVolumeSpecName: "kube-api-access-xzjtl") pod "4078625a-a3c7-45f0-85e1-56c07c7b85b9" (UID: "4078625a-a3c7-45f0-85e1-56c07c7b85b9"). InnerVolumeSpecName "kube-api-access-xzjtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.258436 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mxzpt" event={"ID":"bc656f33-c012-450b-b263-81b4375a3d58","Type":"ContainerDied","Data":"885abbc8c6957115ab9ae65eaabe1990624805339f447fa62b7a62952ebe672f"} Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.258506 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="885abbc8c6957115ab9ae65eaabe1990624805339f447fa62b7a62952ebe672f" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.258643 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mxzpt" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.262072 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-4fce-account-create-update-8h6sf" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.262093 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4fce-account-create-update-8h6sf" event={"ID":"9f018ae8-2c5c-41f5-a7e5-be48d695db30","Type":"ContainerDied","Data":"5aa35e6e4f6302f44b645a84b3b15e87275e08b99f08ce4830d9f8755daa95c1"} Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.262163 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aa35e6e4f6302f44b645a84b3b15e87275e08b99f08ce4830d9f8755daa95c1" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.264908 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tp2tw" event={"ID":"edd66213-7818-408d-a6ec-73c6e3b39321","Type":"ContainerDied","Data":"f99c8b0fe4db2210e8ae12db5695495c7f39a11ce192119719fa2ec2c1566b68"} Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.264974 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f99c8b0fe4db2210e8ae12db5695495c7f39a11ce192119719fa2ec2c1566b68" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.265143 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-tp2tw" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.268158 4775 generic.go:334] "Generic (PLEG): container finished" podID="0451a266-fe64-4e36-93f7-9ebb1e547eec" containerID="290cef72b957bb2ee1c39d2653f4f5bb1e67aa6a9764573ac57356610f089b95" exitCode=0 Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.268360 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0451a266-fe64-4e36-93f7-9ebb1e547eec","Type":"ContainerDied","Data":"290cef72b957bb2ee1c39d2653f4f5bb1e67aa6a9764573ac57356610f089b95"} Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.275025 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jfprw" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.275461 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jfprw" event={"ID":"4078625a-a3c7-45f0-85e1-56c07c7b85b9","Type":"ContainerDied","Data":"5bfa81b30a338622936e3978984b700e9efa11d219a52362380d4ee36ce23cf6"} Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.275551 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bfa81b30a338622936e3978984b700e9efa11d219a52362380d4ee36ce23cf6" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.292163 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6a2c-account-create-update-l9k8g" event={"ID":"41974328-a6d3-4c9f-af51-edd0faa04b5a","Type":"ContainerDied","Data":"715c2f60984e92b2aef232807b2ed9a4ed2eed2b794046c967ac13c6d1454206"} Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.292223 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="715c2f60984e92b2aef232807b2ed9a4ed2eed2b794046c967ac13c6d1454206" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.292267 4775 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s9zj2" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.292307 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6a2c-account-create-update-l9k8g" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.338566 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzjtl\" (UniqueName: \"kubernetes.io/projected/4078625a-a3c7-45f0-85e1-56c07c7b85b9-kube-api-access-xzjtl\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.338623 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4078625a-a3c7-45f0-85e1-56c07c7b85b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.405182 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s9zj2"] Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.417940 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s9zj2"] Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.640999 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5251-account-create-update-fmcxk" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.686823 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-rc6p4" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.748184 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5frw\" (UniqueName: \"kubernetes.io/projected/afe8ec15-4c89-4cf1-a6b7-5e3534660d7b-kube-api-access-g5frw\") pod \"afe8ec15-4c89-4cf1-a6b7-5e3534660d7b\" (UID: \"afe8ec15-4c89-4cf1-a6b7-5e3534660d7b\") " Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.748366 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afe8ec15-4c89-4cf1-a6b7-5e3534660d7b-operator-scripts\") pod \"afe8ec15-4c89-4cf1-a6b7-5e3534660d7b\" (UID: \"afe8ec15-4c89-4cf1-a6b7-5e3534660d7b\") " Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.748422 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd8sd\" (UniqueName: \"kubernetes.io/projected/735a5a7b-bafc-467b-9fa0-9a6755f9f04c-kube-api-access-zd8sd\") pod \"735a5a7b-bafc-467b-9fa0-9a6755f9f04c\" (UID: \"735a5a7b-bafc-467b-9fa0-9a6755f9f04c\") " Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.748485 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/735a5a7b-bafc-467b-9fa0-9a6755f9f04c-operator-scripts\") pod \"735a5a7b-bafc-467b-9fa0-9a6755f9f04c\" (UID: \"735a5a7b-bafc-467b-9fa0-9a6755f9f04c\") " Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.748939 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afe8ec15-4c89-4cf1-a6b7-5e3534660d7b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "afe8ec15-4c89-4cf1-a6b7-5e3534660d7b" (UID: "afe8ec15-4c89-4cf1-a6b7-5e3534660d7b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.749234 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/735a5a7b-bafc-467b-9fa0-9a6755f9f04c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "735a5a7b-bafc-467b-9fa0-9a6755f9f04c" (UID: "735a5a7b-bafc-467b-9fa0-9a6755f9f04c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.753823 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afe8ec15-4c89-4cf1-a6b7-5e3534660d7b-kube-api-access-g5frw" (OuterVolumeSpecName: "kube-api-access-g5frw") pod "afe8ec15-4c89-4cf1-a6b7-5e3534660d7b" (UID: "afe8ec15-4c89-4cf1-a6b7-5e3534660d7b"). InnerVolumeSpecName "kube-api-access-g5frw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.757121 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/735a5a7b-bafc-467b-9fa0-9a6755f9f04c-kube-api-access-zd8sd" (OuterVolumeSpecName: "kube-api-access-zd8sd") pod "735a5a7b-bafc-467b-9fa0-9a6755f9f04c" (UID: "735a5a7b-bafc-467b-9fa0-9a6755f9f04c"). InnerVolumeSpecName "kube-api-access-zd8sd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.850998 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afe8ec15-4c89-4cf1-a6b7-5e3534660d7b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.851033 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd8sd\" (UniqueName: \"kubernetes.io/projected/735a5a7b-bafc-467b-9fa0-9a6755f9f04c-kube-api-access-zd8sd\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.851045 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/735a5a7b-bafc-467b-9fa0-9a6755f9f04c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:29 crc kubenswrapper[4775]: I1216 15:14:29.851054 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5frw\" (UniqueName: \"kubernetes.io/projected/afe8ec15-4c89-4cf1-a6b7-5e3534660d7b-kube-api-access-g5frw\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:30 crc kubenswrapper[4775]: I1216 15:14:30.303607 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0451a266-fe64-4e36-93f7-9ebb1e547eec","Type":"ContainerStarted","Data":"bd14a367b3db152d794c63c2f4462c459b195bb462deb73243ee7ad2ca5594db"} Dec 16 15:14:30 crc kubenswrapper[4775]: I1216 15:14:30.303912 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:14:30 crc kubenswrapper[4775]: I1216 15:14:30.307419 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5251-account-create-update-fmcxk" Dec 16 15:14:30 crc kubenswrapper[4775]: I1216 15:14:30.307414 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5251-account-create-update-fmcxk" event={"ID":"735a5a7b-bafc-467b-9fa0-9a6755f9f04c","Type":"ContainerDied","Data":"286b43cff16d3ab2f58f13dcd2341ba27cf3c8975a0ae8f7c56abf733c84da61"} Dec 16 15:14:30 crc kubenswrapper[4775]: I1216 15:14:30.307475 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="286b43cff16d3ab2f58f13dcd2341ba27cf3c8975a0ae8f7c56abf733c84da61" Dec 16 15:14:30 crc kubenswrapper[4775]: I1216 15:14:30.309187 4775 generic.go:334] "Generic (PLEG): container finished" podID="79fbce0a-9f2b-4548-b886-de6dfe5ff245" containerID="790a4a60bbabbe93361bb85ff6f9a1546bd650f527fc5aeb455b91ee31cccce3" exitCode=0 Dec 16 15:14:30 crc kubenswrapper[4775]: I1216 15:14:30.309254 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"79fbce0a-9f2b-4548-b886-de6dfe5ff245","Type":"ContainerDied","Data":"790a4a60bbabbe93361bb85ff6f9a1546bd650f527fc5aeb455b91ee31cccce3"} Dec 16 15:14:30 crc kubenswrapper[4775]: I1216 15:14:30.311406 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rc6p4" event={"ID":"afe8ec15-4c89-4cf1-a6b7-5e3534660d7b","Type":"ContainerDied","Data":"6367fa6c73972162901c034f779d26124dac048e835fba085ba7b8344e0b6de3"} Dec 16 15:14:30 crc kubenswrapper[4775]: I1216 15:14:30.311450 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6367fa6c73972162901c034f779d26124dac048e835fba085ba7b8344e0b6de3" Dec 16 15:14:30 crc kubenswrapper[4775]: I1216 15:14:30.311520 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-rc6p4" Dec 16 15:14:30 crc kubenswrapper[4775]: I1216 15:14:30.338412 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=53.254899885 podStartE2EDuration="1m0.338393543s" podCreationTimestamp="2025-12-16 15:13:30 +0000 UTC" firstStartedPulling="2025-12-16 15:13:48.570660704 +0000 UTC m=+1153.521739627" lastFinishedPulling="2025-12-16 15:13:55.654154362 +0000 UTC m=+1160.605233285" observedRunningTime="2025-12-16 15:14:30.334922104 +0000 UTC m=+1195.286001027" watchObservedRunningTime="2025-12-16 15:14:30.338393543 +0000 UTC m=+1195.289472466" Dec 16 15:14:30 crc kubenswrapper[4775]: I1216 15:14:30.999260 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-7lsw2"] Dec 16 15:14:31 crc kubenswrapper[4775]: E1216 15:14:31.000064 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f018ae8-2c5c-41f5-a7e5-be48d695db30" containerName="mariadb-account-create-update" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000086 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f018ae8-2c5c-41f5-a7e5-be48d695db30" containerName="mariadb-account-create-update" Dec 16 15:14:31 crc kubenswrapper[4775]: E1216 15:14:31.000106 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe8ec15-4c89-4cf1-a6b7-5e3534660d7b" containerName="mariadb-database-create" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000113 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe8ec15-4c89-4cf1-a6b7-5e3534660d7b" containerName="mariadb-database-create" Dec 16 15:14:31 crc kubenswrapper[4775]: E1216 15:14:31.000124 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41974328-a6d3-4c9f-af51-edd0faa04b5a" containerName="mariadb-account-create-update" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000130 4775 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="41974328-a6d3-4c9f-af51-edd0faa04b5a" containerName="mariadb-account-create-update" Dec 16 15:14:31 crc kubenswrapper[4775]: E1216 15:14:31.000145 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd66213-7818-408d-a6ec-73c6e3b39321" containerName="swift-ring-rebalance" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000151 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd66213-7818-408d-a6ec-73c6e3b39321" containerName="swift-ring-rebalance" Dec 16 15:14:31 crc kubenswrapper[4775]: E1216 15:14:31.000163 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="735a5a7b-bafc-467b-9fa0-9a6755f9f04c" containerName="mariadb-account-create-update" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000169 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="735a5a7b-bafc-467b-9fa0-9a6755f9f04c" containerName="mariadb-account-create-update" Dec 16 15:14:31 crc kubenswrapper[4775]: E1216 15:14:31.000181 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06fb4931-8386-4c6f-86c6-2cb5c0a323f0" containerName="dnsmasq-dns" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000187 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="06fb4931-8386-4c6f-86c6-2cb5c0a323f0" containerName="dnsmasq-dns" Dec 16 15:14:31 crc kubenswrapper[4775]: E1216 15:14:31.000196 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06fb4931-8386-4c6f-86c6-2cb5c0a323f0" containerName="init" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000202 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="06fb4931-8386-4c6f-86c6-2cb5c0a323f0" containerName="init" Dec 16 15:14:31 crc kubenswrapper[4775]: E1216 15:14:31.000214 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4078625a-a3c7-45f0-85e1-56c07c7b85b9" containerName="mariadb-database-create" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000220 4775 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="4078625a-a3c7-45f0-85e1-56c07c7b85b9" containerName="mariadb-database-create" Dec 16 15:14:31 crc kubenswrapper[4775]: E1216 15:14:31.000233 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc656f33-c012-450b-b263-81b4375a3d58" containerName="mariadb-database-create" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000239 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc656f33-c012-450b-b263-81b4375a3d58" containerName="mariadb-database-create" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000387 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="afe8ec15-4c89-4cf1-a6b7-5e3534660d7b" containerName="mariadb-database-create" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000397 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="4078625a-a3c7-45f0-85e1-56c07c7b85b9" containerName="mariadb-database-create" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000406 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f018ae8-2c5c-41f5-a7e5-be48d695db30" containerName="mariadb-account-create-update" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000413 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="06fb4931-8386-4c6f-86c6-2cb5c0a323f0" containerName="dnsmasq-dns" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000420 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc656f33-c012-450b-b263-81b4375a3d58" containerName="mariadb-database-create" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000425 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="735a5a7b-bafc-467b-9fa0-9a6755f9f04c" containerName="mariadb-account-create-update" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000433 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="edd66213-7818-408d-a6ec-73c6e3b39321" 
containerName="swift-ring-rebalance" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000445 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="41974328-a6d3-4c9f-af51-edd0faa04b5a" containerName="mariadb-account-create-update" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.000974 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7lsw2" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.002835 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.004531 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lzhbs" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.015478 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7lsw2"] Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.175994 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5vpd\" (UniqueName: \"kubernetes.io/projected/a8de491d-4c4f-44bc-82d5-7d571b4920e8-kube-api-access-p5vpd\") pod \"glance-db-sync-7lsw2\" (UID: \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\") " pod="openstack/glance-db-sync-7lsw2" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.176337 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-db-sync-config-data\") pod \"glance-db-sync-7lsw2\" (UID: \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\") " pod="openstack/glance-db-sync-7lsw2" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.176415 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-config-data\") pod 
\"glance-db-sync-7lsw2\" (UID: \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\") " pod="openstack/glance-db-sync-7lsw2" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.176535 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-combined-ca-bundle\") pod \"glance-db-sync-7lsw2\" (UID: \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\") " pod="openstack/glance-db-sync-7lsw2" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.278840 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5vpd\" (UniqueName: \"kubernetes.io/projected/a8de491d-4c4f-44bc-82d5-7d571b4920e8-kube-api-access-p5vpd\") pod \"glance-db-sync-7lsw2\" (UID: \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\") " pod="openstack/glance-db-sync-7lsw2" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.279013 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-db-sync-config-data\") pod \"glance-db-sync-7lsw2\" (UID: \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\") " pod="openstack/glance-db-sync-7lsw2" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.279088 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-config-data\") pod \"glance-db-sync-7lsw2\" (UID: \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\") " pod="openstack/glance-db-sync-7lsw2" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.279138 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-combined-ca-bundle\") pod \"glance-db-sync-7lsw2\" (UID: \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\") " 
pod="openstack/glance-db-sync-7lsw2" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.286961 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-combined-ca-bundle\") pod \"glance-db-sync-7lsw2\" (UID: \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\") " pod="openstack/glance-db-sync-7lsw2" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.287076 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-db-sync-config-data\") pod \"glance-db-sync-7lsw2\" (UID: \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\") " pod="openstack/glance-db-sync-7lsw2" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.287629 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-config-data\") pod \"glance-db-sync-7lsw2\" (UID: \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\") " pod="openstack/glance-db-sync-7lsw2" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.298225 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5vpd\" (UniqueName: \"kubernetes.io/projected/a8de491d-4c4f-44bc-82d5-7d571b4920e8-kube-api-access-p5vpd\") pod \"glance-db-sync-7lsw2\" (UID: \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\") " pod="openstack/glance-db-sync-7lsw2" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.321510 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-7lsw2" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.327860 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"79fbce0a-9f2b-4548-b886-de6dfe5ff245","Type":"ContainerStarted","Data":"a4159eb3eaf7d3182fc730fa78a7925b1f4d365d729ae0261021a1d8812d6dc5"} Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.328994 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.355130 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06fb4931-8386-4c6f-86c6-2cb5c0a323f0" path="/var/lib/kubelet/pods/06fb4931-8386-4c6f-86c6-2cb5c0a323f0/volumes" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.358338 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=53.546793756 podStartE2EDuration="1m1.35831329s" podCreationTimestamp="2025-12-16 15:13:30 +0000 UTC" firstStartedPulling="2025-12-16 15:13:47.924469157 +0000 UTC m=+1152.875548100" lastFinishedPulling="2025-12-16 15:13:55.735988711 +0000 UTC m=+1160.687067634" observedRunningTime="2025-12-16 15:14:31.356118621 +0000 UTC m=+1196.307197544" watchObservedRunningTime="2025-12-16 15:14:31.35831329 +0000 UTC m=+1196.309392213" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.506321 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rkmmt" podUID="b560f177-aa8d-4722-92bd-4ef2755caab0" containerName="ovn-controller" probeResult="failure" output=< Dec 16 15:14:31 crc kubenswrapper[4775]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 16 15:14:31 crc kubenswrapper[4775]: > Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.544813 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.570206 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-c5f9m" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.819457 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rkmmt-config-njqsv"] Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.828109 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.831242 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 16 15:14:31 crc kubenswrapper[4775]: I1216 15:14:31.834768 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rkmmt-config-njqsv"] Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.003902 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-run\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.004494 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5c9e31d6-f45a-493b-89db-b597e44c4269-additional-scripts\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.004553 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-run-ovn\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.004608 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65nxq\" (UniqueName: \"kubernetes.io/projected/5c9e31d6-f45a-493b-89db-b597e44c4269-kube-api-access-65nxq\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.004657 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c9e31d6-f45a-493b-89db-b597e44c4269-scripts\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.004708 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-log-ovn\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.004079 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7lsw2"] Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.106488 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5c9e31d6-f45a-493b-89db-b597e44c4269-additional-scripts\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " 
pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.107286 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-run-ovn\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.107441 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65nxq\" (UniqueName: \"kubernetes.io/projected/5c9e31d6-f45a-493b-89db-b597e44c4269-kube-api-access-65nxq\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.107727 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c9e31d6-f45a-493b-89db-b597e44c4269-scripts\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.107475 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5c9e31d6-f45a-493b-89db-b597e44c4269-additional-scripts\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.108075 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-run-ovn\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " 
pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.107933 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-log-ovn\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.108231 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-run\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.108318 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-log-ovn\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.108445 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-run\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.109668 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c9e31d6-f45a-493b-89db-b597e44c4269-scripts\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc 
kubenswrapper[4775]: I1216 15:14:32.131495 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65nxq\" (UniqueName: \"kubernetes.io/projected/5c9e31d6-f45a-493b-89db-b597e44c4269-kube-api-access-65nxq\") pod \"ovn-controller-rkmmt-config-njqsv\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.149261 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.341411 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7lsw2" event={"ID":"a8de491d-4c4f-44bc-82d5-7d571b4920e8","Type":"ContainerStarted","Data":"8bd9c4acff079d8f054f40e1fa3fc09a6b3f4a98336a821acf71b21ff534bcae"} Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.699921 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rkmmt-config-njqsv"] Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.868697 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:14:32 crc kubenswrapper[4775]: I1216 15:14:32.869202 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:14:33 crc kubenswrapper[4775]: I1216 15:14:33.351130 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rkmmt-config-njqsv" 
event={"ID":"5c9e31d6-f45a-493b-89db-b597e44c4269","Type":"ContainerStarted","Data":"d84554a5c40da4428974694dcc9e3f561c5cfade6fd51ad035f0b93b7ca513c5"} Dec 16 15:14:33 crc kubenswrapper[4775]: I1216 15:14:33.351199 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rkmmt-config-njqsv" event={"ID":"5c9e31d6-f45a-493b-89db-b597e44c4269","Type":"ContainerStarted","Data":"5772b4ed8958c42263701bf5f989b86cd2feea0097cd2b0421f9bdb7874eb8a8"} Dec 16 15:14:33 crc kubenswrapper[4775]: I1216 15:14:33.374604 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-rkmmt-config-njqsv" podStartSLOduration=2.374585672 podStartE2EDuration="2.374585672s" podCreationTimestamp="2025-12-16 15:14:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:14:33.369362687 +0000 UTC m=+1198.320441610" watchObservedRunningTime="2025-12-16 15:14:33.374585672 +0000 UTC m=+1198.325664585" Dec 16 15:14:34 crc kubenswrapper[4775]: I1216 15:14:34.360676 4775 generic.go:334] "Generic (PLEG): container finished" podID="5c9e31d6-f45a-493b-89db-b597e44c4269" containerID="d84554a5c40da4428974694dcc9e3f561c5cfade6fd51ad035f0b93b7ca513c5" exitCode=0 Dec 16 15:14:34 crc kubenswrapper[4775]: I1216 15:14:34.361084 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rkmmt-config-njqsv" event={"ID":"5c9e31d6-f45a-493b-89db-b597e44c4269","Type":"ContainerDied","Data":"d84554a5c40da4428974694dcc9e3f561c5cfade6fd51ad035f0b93b7ca513c5"} Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.799538 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.887078 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-log-ovn\") pod \"5c9e31d6-f45a-493b-89db-b597e44c4269\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.887143 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-run\") pod \"5c9e31d6-f45a-493b-89db-b597e44c4269\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.887214 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-run-ovn\") pod \"5c9e31d6-f45a-493b-89db-b597e44c4269\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.887258 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-run" (OuterVolumeSpecName: "var-run") pod "5c9e31d6-f45a-493b-89db-b597e44c4269" (UID: "5c9e31d6-f45a-493b-89db-b597e44c4269"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.887347 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5c9e31d6-f45a-493b-89db-b597e44c4269" (UID: "5c9e31d6-f45a-493b-89db-b597e44c4269"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.887376 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5c9e31d6-f45a-493b-89db-b597e44c4269-additional-scripts\") pod \"5c9e31d6-f45a-493b-89db-b597e44c4269\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.887468 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c9e31d6-f45a-493b-89db-b597e44c4269-scripts\") pod \"5c9e31d6-f45a-493b-89db-b597e44c4269\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.887519 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65nxq\" (UniqueName: \"kubernetes.io/projected/5c9e31d6-f45a-493b-89db-b597e44c4269-kube-api-access-65nxq\") pod \"5c9e31d6-f45a-493b-89db-b597e44c4269\" (UID: \"5c9e31d6-f45a-493b-89db-b597e44c4269\") " Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.887938 4775 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-run\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.887951 4775 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.888187 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c9e31d6-f45a-493b-89db-b597e44c4269-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5c9e31d6-f45a-493b-89db-b597e44c4269" (UID: "5c9e31d6-f45a-493b-89db-b597e44c4269"). 
InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.888260 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5c9e31d6-f45a-493b-89db-b597e44c4269" (UID: "5c9e31d6-f45a-493b-89db-b597e44c4269"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.888823 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c9e31d6-f45a-493b-89db-b597e44c4269-scripts" (OuterVolumeSpecName: "scripts") pod "5c9e31d6-f45a-493b-89db-b597e44c4269" (UID: "5c9e31d6-f45a-493b-89db-b597e44c4269"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.895147 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c9e31d6-f45a-493b-89db-b597e44c4269-kube-api-access-65nxq" (OuterVolumeSpecName: "kube-api-access-65nxq") pod "5c9e31d6-f45a-493b-89db-b597e44c4269" (UID: "5c9e31d6-f45a-493b-89db-b597e44c4269"). InnerVolumeSpecName "kube-api-access-65nxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.990380 4775 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5c9e31d6-f45a-493b-89db-b597e44c4269-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.990449 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c9e31d6-f45a-493b-89db-b597e44c4269-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.990465 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65nxq\" (UniqueName: \"kubernetes.io/projected/5c9e31d6-f45a-493b-89db-b597e44c4269-kube-api-access-65nxq\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:35 crc kubenswrapper[4775]: I1216 15:14:35.990480 4775 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9e31d6-f45a-493b-89db-b597e44c4269-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:36 crc kubenswrapper[4775]: I1216 15:14:36.386828 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rkmmt-config-njqsv" event={"ID":"5c9e31d6-f45a-493b-89db-b597e44c4269","Type":"ContainerDied","Data":"5772b4ed8958c42263701bf5f989b86cd2feea0097cd2b0421f9bdb7874eb8a8"} Dec 16 15:14:36 crc kubenswrapper[4775]: I1216 15:14:36.386910 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5772b4ed8958c42263701bf5f989b86cd2feea0097cd2b0421f9bdb7874eb8a8" Dec 16 15:14:36 crc kubenswrapper[4775]: I1216 15:14:36.386960 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rkmmt-config-njqsv" Dec 16 15:14:36 crc kubenswrapper[4775]: I1216 15:14:36.494906 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-rkmmt-config-njqsv"] Dec 16 15:14:36 crc kubenswrapper[4775]: I1216 15:14:36.508404 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-rkmmt-config-njqsv"] Dec 16 15:14:36 crc kubenswrapper[4775]: I1216 15:14:36.511422 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-rkmmt" Dec 16 15:14:37 crc kubenswrapper[4775]: I1216 15:14:37.356248 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c9e31d6-f45a-493b-89db-b597e44c4269" path="/var/lib/kubelet/pods/5c9e31d6-f45a-493b-89db-b597e44c4269/volumes" Dec 16 15:14:40 crc kubenswrapper[4775]: I1216 15:14:40.488934 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:40 crc kubenswrapper[4775]: I1216 15:14:40.498938 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8b23fde4-e483-4825-969c-94ebc8396511-etc-swift\") pod \"swift-storage-0\" (UID: \"8b23fde4-e483-4825-969c-94ebc8396511\") " pod="openstack/swift-storage-0" Dec 16 15:14:40 crc kubenswrapper[4775]: I1216 15:14:40.651546 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.005208 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.353204 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.480975 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-aacd-account-create-update-swcfz"] Dec 16 15:14:42 crc kubenswrapper[4775]: E1216 15:14:42.481582 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c9e31d6-f45a-493b-89db-b597e44c4269" containerName="ovn-config" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.481605 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9e31d6-f45a-493b-89db-b597e44c4269" containerName="ovn-config" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.481814 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c9e31d6-f45a-493b-89db-b597e44c4269" containerName="ovn-config" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.482550 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-aacd-account-create-update-swcfz" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.485977 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.507173 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-sxkbc"] Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.512479 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-sxkbc" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.537859 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-aacd-account-create-update-swcfz"] Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.551670 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba3c5008-fead-4517-9aed-d02d07560a0a-operator-scripts\") pod \"cinder-aacd-account-create-update-swcfz\" (UID: \"ba3c5008-fead-4517-9aed-d02d07560a0a\") " pod="openstack/cinder-aacd-account-create-update-swcfz" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.551768 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck72q\" (UniqueName: \"kubernetes.io/projected/ba3c5008-fead-4517-9aed-d02d07560a0a-kube-api-access-ck72q\") pod \"cinder-aacd-account-create-update-swcfz\" (UID: \"ba3c5008-fead-4517-9aed-d02d07560a0a\") " pod="openstack/cinder-aacd-account-create-update-swcfz" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.553361 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-sxkbc"] Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.651275 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-tj4s6"] Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.652628 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tj4s6" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.654327 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck72q\" (UniqueName: \"kubernetes.io/projected/ba3c5008-fead-4517-9aed-d02d07560a0a-kube-api-access-ck72q\") pod \"cinder-aacd-account-create-update-swcfz\" (UID: \"ba3c5008-fead-4517-9aed-d02d07560a0a\") " pod="openstack/cinder-aacd-account-create-update-swcfz" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.654413 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jmdv\" (UniqueName: \"kubernetes.io/projected/8c404df9-a26b-44f1-bd1a-be3f6877f896-kube-api-access-7jmdv\") pod \"barbican-db-create-sxkbc\" (UID: \"8c404df9-a26b-44f1-bd1a-be3f6877f896\") " pod="openstack/barbican-db-create-sxkbc" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.654457 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c404df9-a26b-44f1-bd1a-be3f6877f896-operator-scripts\") pod \"barbican-db-create-sxkbc\" (UID: \"8c404df9-a26b-44f1-bd1a-be3f6877f896\") " pod="openstack/barbican-db-create-sxkbc" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.654475 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba3c5008-fead-4517-9aed-d02d07560a0a-operator-scripts\") pod \"cinder-aacd-account-create-update-swcfz\" (UID: \"ba3c5008-fead-4517-9aed-d02d07560a0a\") " pod="openstack/cinder-aacd-account-create-update-swcfz" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.655206 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba3c5008-fead-4517-9aed-d02d07560a0a-operator-scripts\") pod 
\"cinder-aacd-account-create-update-swcfz\" (UID: \"ba3c5008-fead-4517-9aed-d02d07560a0a\") " pod="openstack/cinder-aacd-account-create-update-swcfz" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.664111 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-bb88-account-create-update-tdzmc"] Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.667225 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bb88-account-create-update-tdzmc" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.681059 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.683384 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tj4s6"] Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.702966 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck72q\" (UniqueName: \"kubernetes.io/projected/ba3c5008-fead-4517-9aed-d02d07560a0a-kube-api-access-ck72q\") pod \"cinder-aacd-account-create-update-swcfz\" (UID: \"ba3c5008-fead-4517-9aed-d02d07560a0a\") " pod="openstack/cinder-aacd-account-create-update-swcfz" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.711318 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bb88-account-create-update-tdzmc"] Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.756376 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e1160f7-8d33-49ce-a91c-7c4e95335f49-operator-scripts\") pod \"cinder-db-create-tj4s6\" (UID: \"8e1160f7-8d33-49ce-a91c-7c4e95335f49\") " pod="openstack/cinder-db-create-tj4s6" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.756705 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-l22l9\" (UniqueName: \"kubernetes.io/projected/8e1160f7-8d33-49ce-a91c-7c4e95335f49-kube-api-access-l22l9\") pod \"cinder-db-create-tj4s6\" (UID: \"8e1160f7-8d33-49ce-a91c-7c4e95335f49\") " pod="openstack/cinder-db-create-tj4s6" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.756816 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24nss\" (UniqueName: \"kubernetes.io/projected/7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f-kube-api-access-24nss\") pod \"barbican-bb88-account-create-update-tdzmc\" (UID: \"7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f\") " pod="openstack/barbican-bb88-account-create-update-tdzmc" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.756988 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jmdv\" (UniqueName: \"kubernetes.io/projected/8c404df9-a26b-44f1-bd1a-be3f6877f896-kube-api-access-7jmdv\") pod \"barbican-db-create-sxkbc\" (UID: \"8c404df9-a26b-44f1-bd1a-be3f6877f896\") " pod="openstack/barbican-db-create-sxkbc" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.757097 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c404df9-a26b-44f1-bd1a-be3f6877f896-operator-scripts\") pod \"barbican-db-create-sxkbc\" (UID: \"8c404df9-a26b-44f1-bd1a-be3f6877f896\") " pod="openstack/barbican-db-create-sxkbc" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.757179 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f-operator-scripts\") pod \"barbican-bb88-account-create-update-tdzmc\" (UID: \"7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f\") " pod="openstack/barbican-bb88-account-create-update-tdzmc" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.758372 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c404df9-a26b-44f1-bd1a-be3f6877f896-operator-scripts\") pod \"barbican-db-create-sxkbc\" (UID: \"8c404df9-a26b-44f1-bd1a-be3f6877f896\") " pod="openstack/barbican-db-create-sxkbc" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.777311 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jmdv\" (UniqueName: \"kubernetes.io/projected/8c404df9-a26b-44f1-bd1a-be3f6877f896-kube-api-access-7jmdv\") pod \"barbican-db-create-sxkbc\" (UID: \"8c404df9-a26b-44f1-bd1a-be3f6877f896\") " pod="openstack/barbican-db-create-sxkbc" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.788834 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-q52vm"] Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.790506 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-q52vm" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.797296 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.798752 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-q52vm"] Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.800668 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.800940 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rsk8n" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.801171 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.835783 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-lx2gs"] 
Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.838413 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-lx2gs" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.848691 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-aacd-account-create-update-swcfz" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.853458 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-lx2gs"] Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.859175 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-config-data\") pod \"keystone-db-sync-q52vm\" (UID: \"aeb36c52-8730-43d7-a18e-6f1e3d8312ff\") " pod="openstack/keystone-db-sync-q52vm" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.862075 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e1160f7-8d33-49ce-a91c-7c4e95335f49-operator-scripts\") pod \"cinder-db-create-tj4s6\" (UID: \"8e1160f7-8d33-49ce-a91c-7c4e95335f49\") " pod="openstack/cinder-db-create-tj4s6" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.862117 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l22l9\" (UniqueName: \"kubernetes.io/projected/8e1160f7-8d33-49ce-a91c-7c4e95335f49-kube-api-access-l22l9\") pod \"cinder-db-create-tj4s6\" (UID: \"8e1160f7-8d33-49ce-a91c-7c4e95335f49\") " pod="openstack/cinder-db-create-tj4s6" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.862137 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24nss\" (UniqueName: \"kubernetes.io/projected/7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f-kube-api-access-24nss\") pod 
\"barbican-bb88-account-create-update-tdzmc\" (UID: \"7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f\") " pod="openstack/barbican-bb88-account-create-update-tdzmc" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.862178 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbnmx\" (UniqueName: \"kubernetes.io/projected/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-kube-api-access-rbnmx\") pod \"keystone-db-sync-q52vm\" (UID: \"aeb36c52-8730-43d7-a18e-6f1e3d8312ff\") " pod="openstack/keystone-db-sync-q52vm" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.862255 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f-operator-scripts\") pod \"barbican-bb88-account-create-update-tdzmc\" (UID: \"7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f\") " pod="openstack/barbican-bb88-account-create-update-tdzmc" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.862371 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-combined-ca-bundle\") pod \"keystone-db-sync-q52vm\" (UID: \"aeb36c52-8730-43d7-a18e-6f1e3d8312ff\") " pod="openstack/keystone-db-sync-q52vm" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.863101 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e1160f7-8d33-49ce-a91c-7c4e95335f49-operator-scripts\") pod \"cinder-db-create-tj4s6\" (UID: \"8e1160f7-8d33-49ce-a91c-7c4e95335f49\") " pod="openstack/cinder-db-create-tj4s6" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.863909 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f-operator-scripts\") pod \"barbican-bb88-account-create-update-tdzmc\" (UID: \"7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f\") " pod="openstack/barbican-bb88-account-create-update-tdzmc" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.881468 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24nss\" (UniqueName: \"kubernetes.io/projected/7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f-kube-api-access-24nss\") pod \"barbican-bb88-account-create-update-tdzmc\" (UID: \"7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f\") " pod="openstack/barbican-bb88-account-create-update-tdzmc" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.883198 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sxkbc" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.884472 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l22l9\" (UniqueName: \"kubernetes.io/projected/8e1160f7-8d33-49ce-a91c-7c4e95335f49-kube-api-access-l22l9\") pod \"cinder-db-create-tj4s6\" (UID: \"8e1160f7-8d33-49ce-a91c-7c4e95335f49\") " pod="openstack/cinder-db-create-tj4s6" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.931088 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-7skc8"] Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.934178 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7skc8" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.944003 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7302-account-create-update-n2t2m"] Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.945263 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7302-account-create-update-n2t2m" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.954991 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7skc8"] Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.958392 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.964283 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-config-data\") pod \"keystone-db-sync-q52vm\" (UID: \"aeb36c52-8730-43d7-a18e-6f1e3d8312ff\") " pod="openstack/keystone-db-sync-q52vm" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.964342 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44vhr\" (UniqueName: \"kubernetes.io/projected/7aa308ba-b5b7-4acb-9744-0254560e0a1f-kube-api-access-44vhr\") pod \"heat-db-create-lx2gs\" (UID: \"7aa308ba-b5b7-4acb-9744-0254560e0a1f\") " pod="openstack/heat-db-create-lx2gs" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.964405 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbnmx\" (UniqueName: \"kubernetes.io/projected/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-kube-api-access-rbnmx\") pod \"keystone-db-sync-q52vm\" (UID: \"aeb36c52-8730-43d7-a18e-6f1e3d8312ff\") " pod="openstack/keystone-db-sync-q52vm" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.964640 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7aa308ba-b5b7-4acb-9744-0254560e0a1f-operator-scripts\") pod \"heat-db-create-lx2gs\" (UID: \"7aa308ba-b5b7-4acb-9744-0254560e0a1f\") " pod="openstack/heat-db-create-lx2gs" Dec 16 15:14:42 crc 
kubenswrapper[4775]: I1216 15:14:42.965002 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-combined-ca-bundle\") pod \"keystone-db-sync-q52vm\" (UID: \"aeb36c52-8730-43d7-a18e-6f1e3d8312ff\") " pod="openstack/keystone-db-sync-q52vm" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.968406 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-config-data\") pod \"keystone-db-sync-q52vm\" (UID: \"aeb36c52-8730-43d7-a18e-6f1e3d8312ff\") " pod="openstack/keystone-db-sync-q52vm" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.971650 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-combined-ca-bundle\") pod \"keystone-db-sync-q52vm\" (UID: \"aeb36c52-8730-43d7-a18e-6f1e3d8312ff\") " pod="openstack/keystone-db-sync-q52vm" Dec 16 15:14:42 crc kubenswrapper[4775]: I1216 15:14:42.983181 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7302-account-create-update-n2t2m"] Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.000790 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbnmx\" (UniqueName: \"kubernetes.io/projected/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-kube-api-access-rbnmx\") pod \"keystone-db-sync-q52vm\" (UID: \"aeb36c52-8730-43d7-a18e-6f1e3d8312ff\") " pod="openstack/keystone-db-sync-q52vm" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.032190 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tj4s6" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.051436 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-4908-account-create-update-plzkb"] Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.053243 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4908-account-create-update-plzkb" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.053804 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bb88-account-create-update-tdzmc" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.055725 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.060554 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-4908-account-create-update-plzkb"] Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.068868 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7aa308ba-b5b7-4acb-9744-0254560e0a1f-operator-scripts\") pod \"heat-db-create-lx2gs\" (UID: \"7aa308ba-b5b7-4acb-9744-0254560e0a1f\") " pod="openstack/heat-db-create-lx2gs" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.069157 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vd95\" (UniqueName: \"kubernetes.io/projected/42d94310-d877-4e8d-b540-6526c1c26626-kube-api-access-4vd95\") pod \"neutron-7302-account-create-update-n2t2m\" (UID: \"42d94310-d877-4e8d-b540-6526c1c26626\") " pod="openstack/neutron-7302-account-create-update-n2t2m" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.069206 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6a46193a-2b19-4e7e-b0fe-090319010bba-operator-scripts\") pod \"neutron-db-create-7skc8\" (UID: \"6a46193a-2b19-4e7e-b0fe-090319010bba\") " pod="openstack/neutron-db-create-7skc8" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.069256 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42d94310-d877-4e8d-b540-6526c1c26626-operator-scripts\") pod \"neutron-7302-account-create-update-n2t2m\" (UID: \"42d94310-d877-4e8d-b540-6526c1c26626\") " pod="openstack/neutron-7302-account-create-update-n2t2m" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.069315 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44vhr\" (UniqueName: \"kubernetes.io/projected/7aa308ba-b5b7-4acb-9744-0254560e0a1f-kube-api-access-44vhr\") pod \"heat-db-create-lx2gs\" (UID: \"7aa308ba-b5b7-4acb-9744-0254560e0a1f\") " pod="openstack/heat-db-create-lx2gs" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.069499 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmx2c\" (UniqueName: \"kubernetes.io/projected/6a46193a-2b19-4e7e-b0fe-090319010bba-kube-api-access-lmx2c\") pod \"neutron-db-create-7skc8\" (UID: \"6a46193a-2b19-4e7e-b0fe-090319010bba\") " pod="openstack/neutron-db-create-7skc8" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.070419 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7aa308ba-b5b7-4acb-9744-0254560e0a1f-operator-scripts\") pod \"heat-db-create-lx2gs\" (UID: \"7aa308ba-b5b7-4acb-9744-0254560e0a1f\") " pod="openstack/heat-db-create-lx2gs" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.095285 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44vhr\" (UniqueName: 
\"kubernetes.io/projected/7aa308ba-b5b7-4acb-9744-0254560e0a1f-kube-api-access-44vhr\") pod \"heat-db-create-lx2gs\" (UID: \"7aa308ba-b5b7-4acb-9744-0254560e0a1f\") " pod="openstack/heat-db-create-lx2gs" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.139123 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-q52vm" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.157737 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-lx2gs" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.171751 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a4cdadf-d811-4803-8dc0-3815d029a281-operator-scripts\") pod \"heat-4908-account-create-update-plzkb\" (UID: \"2a4cdadf-d811-4803-8dc0-3815d029a281\") " pod="openstack/heat-4908-account-create-update-plzkb" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.171902 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vd95\" (UniqueName: \"kubernetes.io/projected/42d94310-d877-4e8d-b540-6526c1c26626-kube-api-access-4vd95\") pod \"neutron-7302-account-create-update-n2t2m\" (UID: \"42d94310-d877-4e8d-b540-6526c1c26626\") " pod="openstack/neutron-7302-account-create-update-n2t2m" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.171947 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a46193a-2b19-4e7e-b0fe-090319010bba-operator-scripts\") pod \"neutron-db-create-7skc8\" (UID: \"6a46193a-2b19-4e7e-b0fe-090319010bba\") " pod="openstack/neutron-db-create-7skc8" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.171982 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79922\" (UniqueName: 
\"kubernetes.io/projected/2a4cdadf-d811-4803-8dc0-3815d029a281-kube-api-access-79922\") pod \"heat-4908-account-create-update-plzkb\" (UID: \"2a4cdadf-d811-4803-8dc0-3815d029a281\") " pod="openstack/heat-4908-account-create-update-plzkb" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.172018 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42d94310-d877-4e8d-b540-6526c1c26626-operator-scripts\") pod \"neutron-7302-account-create-update-n2t2m\" (UID: \"42d94310-d877-4e8d-b540-6526c1c26626\") " pod="openstack/neutron-7302-account-create-update-n2t2m" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.172089 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmx2c\" (UniqueName: \"kubernetes.io/projected/6a46193a-2b19-4e7e-b0fe-090319010bba-kube-api-access-lmx2c\") pod \"neutron-db-create-7skc8\" (UID: \"6a46193a-2b19-4e7e-b0fe-090319010bba\") " pod="openstack/neutron-db-create-7skc8" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.172917 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a46193a-2b19-4e7e-b0fe-090319010bba-operator-scripts\") pod \"neutron-db-create-7skc8\" (UID: \"6a46193a-2b19-4e7e-b0fe-090319010bba\") " pod="openstack/neutron-db-create-7skc8" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.173061 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42d94310-d877-4e8d-b540-6526c1c26626-operator-scripts\") pod \"neutron-7302-account-create-update-n2t2m\" (UID: \"42d94310-d877-4e8d-b540-6526c1c26626\") " pod="openstack/neutron-7302-account-create-update-n2t2m" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.196252 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmx2c\" 
(UniqueName: \"kubernetes.io/projected/6a46193a-2b19-4e7e-b0fe-090319010bba-kube-api-access-lmx2c\") pod \"neutron-db-create-7skc8\" (UID: \"6a46193a-2b19-4e7e-b0fe-090319010bba\") " pod="openstack/neutron-db-create-7skc8" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.199342 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vd95\" (UniqueName: \"kubernetes.io/projected/42d94310-d877-4e8d-b540-6526c1c26626-kube-api-access-4vd95\") pod \"neutron-7302-account-create-update-n2t2m\" (UID: \"42d94310-d877-4e8d-b540-6526c1c26626\") " pod="openstack/neutron-7302-account-create-update-n2t2m" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.274265 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a4cdadf-d811-4803-8dc0-3815d029a281-operator-scripts\") pod \"heat-4908-account-create-update-plzkb\" (UID: \"2a4cdadf-d811-4803-8dc0-3815d029a281\") " pod="openstack/heat-4908-account-create-update-plzkb" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.274386 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79922\" (UniqueName: \"kubernetes.io/projected/2a4cdadf-d811-4803-8dc0-3815d029a281-kube-api-access-79922\") pod \"heat-4908-account-create-update-plzkb\" (UID: \"2a4cdadf-d811-4803-8dc0-3815d029a281\") " pod="openstack/heat-4908-account-create-update-plzkb" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.275182 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a4cdadf-d811-4803-8dc0-3815d029a281-operator-scripts\") pod \"heat-4908-account-create-update-plzkb\" (UID: \"2a4cdadf-d811-4803-8dc0-3815d029a281\") " pod="openstack/heat-4908-account-create-update-plzkb" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.296003 4775 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-79922\" (UniqueName: \"kubernetes.io/projected/2a4cdadf-d811-4803-8dc0-3815d029a281-kube-api-access-79922\") pod \"heat-4908-account-create-update-plzkb\" (UID: \"2a4cdadf-d811-4803-8dc0-3815d029a281\") " pod="openstack/heat-4908-account-create-update-plzkb" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.338859 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7skc8" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.346573 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7302-account-create-update-n2t2m" Dec 16 15:14:43 crc kubenswrapper[4775]: I1216 15:14:43.379779 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4908-account-create-update-plzkb" Dec 16 15:14:48 crc kubenswrapper[4775]: I1216 15:14:48.087821 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tj4s6"] Dec 16 15:14:48 crc kubenswrapper[4775]: W1216 15:14:48.103042 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e1160f7_8d33_49ce_a91c_7c4e95335f49.slice/crio-77addbbfd2720cbf2e5eaa7d58666aa299e872086a0ace56f66017b6b8276e8f WatchSource:0}: Error finding container 77addbbfd2720cbf2e5eaa7d58666aa299e872086a0ace56f66017b6b8276e8f: Status 404 returned error can't find the container with id 77addbbfd2720cbf2e5eaa7d58666aa299e872086a0ace56f66017b6b8276e8f Dec 16 15:14:48 crc kubenswrapper[4775]: I1216 15:14:48.499588 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bb88-account-create-update-tdzmc"] Dec 16 15:14:48 crc kubenswrapper[4775]: I1216 15:14:48.523581 4775 generic.go:334] "Generic (PLEG): container finished" podID="8e1160f7-8d33-49ce-a91c-7c4e95335f49" containerID="c255aae9b300a2ec635893d98d4d6d4fcabd33d0368b9842e2d0028ed1d8853d" exitCode=0 
Dec 16 15:14:48 crc kubenswrapper[4775]: I1216 15:14:48.523878 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tj4s6" event={"ID":"8e1160f7-8d33-49ce-a91c-7c4e95335f49","Type":"ContainerDied","Data":"c255aae9b300a2ec635893d98d4d6d4fcabd33d0368b9842e2d0028ed1d8853d"} Dec 16 15:14:48 crc kubenswrapper[4775]: I1216 15:14:48.524051 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tj4s6" event={"ID":"8e1160f7-8d33-49ce-a91c-7c4e95335f49","Type":"ContainerStarted","Data":"77addbbfd2720cbf2e5eaa7d58666aa299e872086a0ace56f66017b6b8276e8f"} Dec 16 15:14:48 crc kubenswrapper[4775]: I1216 15:14:48.526295 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-sxkbc"] Dec 16 15:14:48 crc kubenswrapper[4775]: I1216 15:14:48.673125 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7skc8"] Dec 16 15:14:48 crc kubenswrapper[4775]: I1216 15:14:48.684445 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7302-account-create-update-n2t2m"] Dec 16 15:14:48 crc kubenswrapper[4775]: I1216 15:14:48.691173 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-q52vm"] Dec 16 15:14:48 crc kubenswrapper[4775]: W1216 15:14:48.692878 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a4cdadf_d811_4803_8dc0_3815d029a281.slice/crio-ec9610e41d5191948e5456f2c31c4aa779ae09460ca6c09dfd24e6fd7b45fe5a WatchSource:0}: Error finding container ec9610e41d5191948e5456f2c31c4aa779ae09460ca6c09dfd24e6fd7b45fe5a: Status 404 returned error can't find the container with id ec9610e41d5191948e5456f2c31c4aa779ae09460ca6c09dfd24e6fd7b45fe5a Dec 16 15:14:48 crc kubenswrapper[4775]: I1216 15:14:48.699271 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-4908-account-create-update-plzkb"] Dec 16 15:14:48 
crc kubenswrapper[4775]: I1216 15:14:48.704495 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 16 15:14:48 crc kubenswrapper[4775]: I1216 15:14:48.709912 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-aacd-account-create-update-swcfz"] Dec 16 15:14:48 crc kubenswrapper[4775]: I1216 15:14:48.845664 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-lx2gs"] Dec 16 15:14:48 crc kubenswrapper[4775]: W1216 15:14:48.851083 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7aa308ba_b5b7_4acb_9744_0254560e0a1f.slice/crio-1acc8dfbd00775225047c0d0e91ce826d648adc5765ec5e738e48a6b0aeb3162 WatchSource:0}: Error finding container 1acc8dfbd00775225047c0d0e91ce826d648adc5765ec5e738e48a6b0aeb3162: Status 404 returned error can't find the container with id 1acc8dfbd00775225047c0d0e91ce826d648adc5765ec5e738e48a6b0aeb3162 Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.534186 4775 generic.go:334] "Generic (PLEG): container finished" podID="6a46193a-2b19-4e7e-b0fe-090319010bba" containerID="c55627749e8172aa389502866530035ef0c596123f9be4c95c6be21c3cc8a398" exitCode=0 Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.534309 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7skc8" event={"ID":"6a46193a-2b19-4e7e-b0fe-090319010bba","Type":"ContainerDied","Data":"c55627749e8172aa389502866530035ef0c596123f9be4c95c6be21c3cc8a398"} Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.535832 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7skc8" event={"ID":"6a46193a-2b19-4e7e-b0fe-090319010bba","Type":"ContainerStarted","Data":"b9d33b5276ad3bf2c5c80088cb57d0e79ec11afd5110547435408e4cdc844e31"} Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.539779 4775 generic.go:334] "Generic (PLEG): container 
finished" podID="2a4cdadf-d811-4803-8dc0-3815d029a281" containerID="4e6a0d5c59ff5f9b62a48578ca6acdbd20efe99ea4ca344c216212989790000d" exitCode=0 Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.539869 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4908-account-create-update-plzkb" event={"ID":"2a4cdadf-d811-4803-8dc0-3815d029a281","Type":"ContainerDied","Data":"4e6a0d5c59ff5f9b62a48578ca6acdbd20efe99ea4ca344c216212989790000d"} Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.539922 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4908-account-create-update-plzkb" event={"ID":"2a4cdadf-d811-4803-8dc0-3815d029a281","Type":"ContainerStarted","Data":"ec9610e41d5191948e5456f2c31c4aa779ae09460ca6c09dfd24e6fd7b45fe5a"} Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.541331 4775 generic.go:334] "Generic (PLEG): container finished" podID="7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f" containerID="c3b9ea8723e61d765a76682c8a3a99f74027968fa26a8cf2943ecfd0066ff450" exitCode=0 Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.541404 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bb88-account-create-update-tdzmc" event={"ID":"7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f","Type":"ContainerDied","Data":"c3b9ea8723e61d765a76682c8a3a99f74027968fa26a8cf2943ecfd0066ff450"} Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.541428 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bb88-account-create-update-tdzmc" event={"ID":"7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f","Type":"ContainerStarted","Data":"633586c0441fde6b75e3f687839625fcc0052fad45ae2eb8e3f659d5ea4ca520"} Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.542921 4775 generic.go:334] "Generic (PLEG): container finished" podID="7aa308ba-b5b7-4acb-9744-0254560e0a1f" containerID="bf0331e99b5c0e9753ae27008d8cbe195a5971f9f2191725eae2b3aa84b9af32" exitCode=0 Dec 16 15:14:49 crc kubenswrapper[4775]: 
I1216 15:14:49.542951 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-lx2gs" event={"ID":"7aa308ba-b5b7-4acb-9744-0254560e0a1f","Type":"ContainerDied","Data":"bf0331e99b5c0e9753ae27008d8cbe195a5971f9f2191725eae2b3aa84b9af32"} Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.542986 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-lx2gs" event={"ID":"7aa308ba-b5b7-4acb-9744-0254560e0a1f","Type":"ContainerStarted","Data":"1acc8dfbd00775225047c0d0e91ce826d648adc5765ec5e738e48a6b0aeb3162"} Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.544427 4775 generic.go:334] "Generic (PLEG): container finished" podID="42d94310-d877-4e8d-b540-6526c1c26626" containerID="2e9743547ab763bfb0e9a21cdce7c236a60c2e8ffb9ccd6357d6a629480e4dad" exitCode=0 Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.544485 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7302-account-create-update-n2t2m" event={"ID":"42d94310-d877-4e8d-b540-6526c1c26626","Type":"ContainerDied","Data":"2e9743547ab763bfb0e9a21cdce7c236a60c2e8ffb9ccd6357d6a629480e4dad"} Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.544506 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7302-account-create-update-n2t2m" event={"ID":"42d94310-d877-4e8d-b540-6526c1c26626","Type":"ContainerStarted","Data":"da4c66bec1675b4bf6518d7c69942b429d3dd8f80737ec885a7ab724f4030cf3"} Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.545602 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8b23fde4-e483-4825-969c-94ebc8396511","Type":"ContainerStarted","Data":"b4515cf935f827cdf7418176e4cd99f479e70cc689b8f6ab0d31a96b3404c504"} Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.547542 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q52vm" 
event={"ID":"aeb36c52-8730-43d7-a18e-6f1e3d8312ff","Type":"ContainerStarted","Data":"c5ae5db4dd014e9337baab8684f860500b7ea47de4395467519e7a7bb528bf9f"} Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.549409 4775 generic.go:334] "Generic (PLEG): container finished" podID="ba3c5008-fead-4517-9aed-d02d07560a0a" containerID="09b9c4b99da956b010933ec11e32239a5835d68ab66ab71457c14fb53743123d" exitCode=0 Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.553128 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-aacd-account-create-update-swcfz" event={"ID":"ba3c5008-fead-4517-9aed-d02d07560a0a","Type":"ContainerDied","Data":"09b9c4b99da956b010933ec11e32239a5835d68ab66ab71457c14fb53743123d"} Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.553328 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-aacd-account-create-update-swcfz" event={"ID":"ba3c5008-fead-4517-9aed-d02d07560a0a","Type":"ContainerStarted","Data":"893dd4f4d16ee851012cdd26da6420b2fe8b0afc2d182aff469e357ad5560fe9"} Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.555800 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7lsw2" event={"ID":"a8de491d-4c4f-44bc-82d5-7d571b4920e8","Type":"ContainerStarted","Data":"89f45c4e5c40a31bc7b6642a5e24cc71e2ec3ecd178924ef14458eb6a6f20b92"} Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.559296 4775 generic.go:334] "Generic (PLEG): container finished" podID="8c404df9-a26b-44f1-bd1a-be3f6877f896" containerID="0178115a38cede9ffec018b174720c1bb83b25f7ccd8307084b884ddf5703460" exitCode=0 Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.559377 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sxkbc" event={"ID":"8c404df9-a26b-44f1-bd1a-be3f6877f896","Type":"ContainerDied","Data":"0178115a38cede9ffec018b174720c1bb83b25f7ccd8307084b884ddf5703460"} Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.559404 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sxkbc" event={"ID":"8c404df9-a26b-44f1-bd1a-be3f6877f896","Type":"ContainerStarted","Data":"9ded97c02202e8426c23b88afaaf38463a798f23fa2bf1ec41ddbd0dac836d8b"} Dec 16 15:14:49 crc kubenswrapper[4775]: I1216 15:14:49.695650 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-7lsw2" podStartSLOduration=4.018014902 podStartE2EDuration="19.695624123s" podCreationTimestamp="2025-12-16 15:14:30 +0000 UTC" firstStartedPulling="2025-12-16 15:14:32.010058022 +0000 UTC m=+1196.961136945" lastFinishedPulling="2025-12-16 15:14:47.687667243 +0000 UTC m=+1212.638746166" observedRunningTime="2025-12-16 15:14:49.691260856 +0000 UTC m=+1214.642339799" watchObservedRunningTime="2025-12-16 15:14:49.695624123 +0000 UTC m=+1214.646703056" Dec 16 15:14:51 crc kubenswrapper[4775]: I1216 15:14:51.016795 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tj4s6" Dec 16 15:14:51 crc kubenswrapper[4775]: I1216 15:14:51.151937 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e1160f7-8d33-49ce-a91c-7c4e95335f49-operator-scripts\") pod \"8e1160f7-8d33-49ce-a91c-7c4e95335f49\" (UID: \"8e1160f7-8d33-49ce-a91c-7c4e95335f49\") " Dec 16 15:14:51 crc kubenswrapper[4775]: I1216 15:14:51.152015 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l22l9\" (UniqueName: \"kubernetes.io/projected/8e1160f7-8d33-49ce-a91c-7c4e95335f49-kube-api-access-l22l9\") pod \"8e1160f7-8d33-49ce-a91c-7c4e95335f49\" (UID: \"8e1160f7-8d33-49ce-a91c-7c4e95335f49\") " Dec 16 15:14:51 crc kubenswrapper[4775]: I1216 15:14:51.153467 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e1160f7-8d33-49ce-a91c-7c4e95335f49-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "8e1160f7-8d33-49ce-a91c-7c4e95335f49" (UID: "8e1160f7-8d33-49ce-a91c-7c4e95335f49"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:51 crc kubenswrapper[4775]: I1216 15:14:51.174052 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e1160f7-8d33-49ce-a91c-7c4e95335f49-kube-api-access-l22l9" (OuterVolumeSpecName: "kube-api-access-l22l9") pod "8e1160f7-8d33-49ce-a91c-7c4e95335f49" (UID: "8e1160f7-8d33-49ce-a91c-7c4e95335f49"). InnerVolumeSpecName "kube-api-access-l22l9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:51 crc kubenswrapper[4775]: I1216 15:14:51.287443 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e1160f7-8d33-49ce-a91c-7c4e95335f49-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:51 crc kubenswrapper[4775]: I1216 15:14:51.287485 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l22l9\" (UniqueName: \"kubernetes.io/projected/8e1160f7-8d33-49ce-a91c-7c4e95335f49-kube-api-access-l22l9\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:51 crc kubenswrapper[4775]: I1216 15:14:51.587007 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tj4s6" event={"ID":"8e1160f7-8d33-49ce-a91c-7c4e95335f49","Type":"ContainerDied","Data":"77addbbfd2720cbf2e5eaa7d58666aa299e872086a0ace56f66017b6b8276e8f"} Dec 16 15:14:51 crc kubenswrapper[4775]: I1216 15:14:51.587050 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77addbbfd2720cbf2e5eaa7d58666aa299e872086a0ace56f66017b6b8276e8f" Dec 16 15:14:51 crc kubenswrapper[4775]: I1216 15:14:51.587112 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tj4s6" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.616505 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-lx2gs" event={"ID":"7aa308ba-b5b7-4acb-9744-0254560e0a1f","Type":"ContainerDied","Data":"1acc8dfbd00775225047c0d0e91ce826d648adc5765ec5e738e48a6b0aeb3162"} Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.617158 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1acc8dfbd00775225047c0d0e91ce826d648adc5765ec5e738e48a6b0aeb3162" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.620758 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7302-account-create-update-n2t2m" event={"ID":"42d94310-d877-4e8d-b540-6526c1c26626","Type":"ContainerDied","Data":"da4c66bec1675b4bf6518d7c69942b429d3dd8f80737ec885a7ab724f4030cf3"} Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.620837 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da4c66bec1675b4bf6518d7c69942b429d3dd8f80737ec885a7ab724f4030cf3" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.622602 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4908-account-create-update-plzkb" event={"ID":"2a4cdadf-d811-4803-8dc0-3815d029a281","Type":"ContainerDied","Data":"ec9610e41d5191948e5456f2c31c4aa779ae09460ca6c09dfd24e6fd7b45fe5a"} Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.622654 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec9610e41d5191948e5456f2c31c4aa779ae09460ca6c09dfd24e6fd7b45fe5a" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.623983 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7skc8" event={"ID":"6a46193a-2b19-4e7e-b0fe-090319010bba","Type":"ContainerDied","Data":"b9d33b5276ad3bf2c5c80088cb57d0e79ec11afd5110547435408e4cdc844e31"} Dec 16 15:14:53 crc 
kubenswrapper[4775]: I1216 15:14:53.624007 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9d33b5276ad3bf2c5c80088cb57d0e79ec11afd5110547435408e4cdc844e31" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.625399 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-aacd-account-create-update-swcfz" event={"ID":"ba3c5008-fead-4517-9aed-d02d07560a0a","Type":"ContainerDied","Data":"893dd4f4d16ee851012cdd26da6420b2fe8b0afc2d182aff469e357ad5560fe9"} Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.625435 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="893dd4f4d16ee851012cdd26da6420b2fe8b0afc2d182aff469e357ad5560fe9" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.627295 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bb88-account-create-update-tdzmc" event={"ID":"7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f","Type":"ContainerDied","Data":"633586c0441fde6b75e3f687839625fcc0052fad45ae2eb8e3f659d5ea4ca520"} Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.627353 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="633586c0441fde6b75e3f687839625fcc0052fad45ae2eb8e3f659d5ea4ca520" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.629496 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sxkbc" event={"ID":"8c404df9-a26b-44f1-bd1a-be3f6877f896","Type":"ContainerDied","Data":"9ded97c02202e8426c23b88afaaf38463a798f23fa2bf1ec41ddbd0dac836d8b"} Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.629536 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ded97c02202e8426c23b88afaaf38463a798f23fa2bf1ec41ddbd0dac836d8b" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.719891 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-sxkbc" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.754495 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-aacd-account-create-update-swcfz" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.775022 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4908-account-create-update-plzkb" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.808170 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7skc8" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.821691 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bb88-account-create-update-tdzmc" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.828963 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jmdv\" (UniqueName: \"kubernetes.io/projected/8c404df9-a26b-44f1-bd1a-be3f6877f896-kube-api-access-7jmdv\") pod \"8c404df9-a26b-44f1-bd1a-be3f6877f896\" (UID: \"8c404df9-a26b-44f1-bd1a-be3f6877f896\") " Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.830743 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c404df9-a26b-44f1-bd1a-be3f6877f896-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c404df9-a26b-44f1-bd1a-be3f6877f896" (UID: "8c404df9-a26b-44f1-bd1a-be3f6877f896"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.831246 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7302-account-create-update-n2t2m" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.829041 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c404df9-a26b-44f1-bd1a-be3f6877f896-operator-scripts\") pod \"8c404df9-a26b-44f1-bd1a-be3f6877f896\" (UID: \"8c404df9-a26b-44f1-bd1a-be3f6877f896\") " Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.833471 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24nss\" (UniqueName: \"kubernetes.io/projected/7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f-kube-api-access-24nss\") pod \"7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f\" (UID: \"7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f\") " Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.833532 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmx2c\" (UniqueName: \"kubernetes.io/projected/6a46193a-2b19-4e7e-b0fe-090319010bba-kube-api-access-lmx2c\") pod \"6a46193a-2b19-4e7e-b0fe-090319010bba\" (UID: \"6a46193a-2b19-4e7e-b0fe-090319010bba\") " Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.833626 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79922\" (UniqueName: \"kubernetes.io/projected/2a4cdadf-d811-4803-8dc0-3815d029a281-kube-api-access-79922\") pod \"2a4cdadf-d811-4803-8dc0-3815d029a281\" (UID: \"2a4cdadf-d811-4803-8dc0-3815d029a281\") " Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.833711 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba3c5008-fead-4517-9aed-d02d07560a0a-operator-scripts\") pod \"ba3c5008-fead-4517-9aed-d02d07560a0a\" (UID: \"ba3c5008-fead-4517-9aed-d02d07560a0a\") " Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.834609 4775 reconciler_common.go:293] 
"Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c404df9-a26b-44f1-bd1a-be3f6877f896-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.839177 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a4cdadf-d811-4803-8dc0-3815d029a281-kube-api-access-79922" (OuterVolumeSpecName: "kube-api-access-79922") pod "2a4cdadf-d811-4803-8dc0-3815d029a281" (UID: "2a4cdadf-d811-4803-8dc0-3815d029a281"). InnerVolumeSpecName "kube-api-access-79922". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.840309 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3c5008-fead-4517-9aed-d02d07560a0a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba3c5008-fead-4517-9aed-d02d07560a0a" (UID: "ba3c5008-fead-4517-9aed-d02d07560a0a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.841326 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c404df9-a26b-44f1-bd1a-be3f6877f896-kube-api-access-7jmdv" (OuterVolumeSpecName: "kube-api-access-7jmdv") pod "8c404df9-a26b-44f1-bd1a-be3f6877f896" (UID: "8c404df9-a26b-44f1-bd1a-be3f6877f896"). InnerVolumeSpecName "kube-api-access-7jmdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.853109 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-lx2gs" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.854780 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f-kube-api-access-24nss" (OuterVolumeSpecName: "kube-api-access-24nss") pod "7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f" (UID: "7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f"). InnerVolumeSpecName "kube-api-access-24nss". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.856338 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a46193a-2b19-4e7e-b0fe-090319010bba-kube-api-access-lmx2c" (OuterVolumeSpecName: "kube-api-access-lmx2c") pod "6a46193a-2b19-4e7e-b0fe-090319010bba" (UID: "6a46193a-2b19-4e7e-b0fe-090319010bba"). InnerVolumeSpecName "kube-api-access-lmx2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.936077 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44vhr\" (UniqueName: \"kubernetes.io/projected/7aa308ba-b5b7-4acb-9744-0254560e0a1f-kube-api-access-44vhr\") pod \"7aa308ba-b5b7-4acb-9744-0254560e0a1f\" (UID: \"7aa308ba-b5b7-4acb-9744-0254560e0a1f\") " Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.936170 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7aa308ba-b5b7-4acb-9744-0254560e0a1f-operator-scripts\") pod \"7aa308ba-b5b7-4acb-9744-0254560e0a1f\" (UID: \"7aa308ba-b5b7-4acb-9744-0254560e0a1f\") " Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.936260 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f-operator-scripts\") pod 
\"7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f\" (UID: \"7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f\") " Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.936287 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42d94310-d877-4e8d-b540-6526c1c26626-operator-scripts\") pod \"42d94310-d877-4e8d-b540-6526c1c26626\" (UID: \"42d94310-d877-4e8d-b540-6526c1c26626\") " Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.936316 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a4cdadf-d811-4803-8dc0-3815d029a281-operator-scripts\") pod \"2a4cdadf-d811-4803-8dc0-3815d029a281\" (UID: \"2a4cdadf-d811-4803-8dc0-3815d029a281\") " Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.936337 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vd95\" (UniqueName: \"kubernetes.io/projected/42d94310-d877-4e8d-b540-6526c1c26626-kube-api-access-4vd95\") pod \"42d94310-d877-4e8d-b540-6526c1c26626\" (UID: \"42d94310-d877-4e8d-b540-6526c1c26626\") " Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.936350 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a46193a-2b19-4e7e-b0fe-090319010bba-operator-scripts\") pod \"6a46193a-2b19-4e7e-b0fe-090319010bba\" (UID: \"6a46193a-2b19-4e7e-b0fe-090319010bba\") " Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.936370 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck72q\" (UniqueName: \"kubernetes.io/projected/ba3c5008-fead-4517-9aed-d02d07560a0a-kube-api-access-ck72q\") pod \"ba3c5008-fead-4517-9aed-d02d07560a0a\" (UID: \"ba3c5008-fead-4517-9aed-d02d07560a0a\") " Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.936700 4775 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-lmx2c\" (UniqueName: \"kubernetes.io/projected/6a46193a-2b19-4e7e-b0fe-090319010bba-kube-api-access-lmx2c\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.936716 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79922\" (UniqueName: \"kubernetes.io/projected/2a4cdadf-d811-4803-8dc0-3815d029a281-kube-api-access-79922\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.936730 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba3c5008-fead-4517-9aed-d02d07560a0a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.936743 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jmdv\" (UniqueName: \"kubernetes.io/projected/8c404df9-a26b-44f1-bd1a-be3f6877f896-kube-api-access-7jmdv\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.936752 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24nss\" (UniqueName: \"kubernetes.io/projected/7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f-kube-api-access-24nss\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.937489 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42d94310-d877-4e8d-b540-6526c1c26626-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "42d94310-d877-4e8d-b540-6526c1c26626" (UID: "42d94310-d877-4e8d-b540-6526c1c26626"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.937701 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a4cdadf-d811-4803-8dc0-3815d029a281-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a4cdadf-d811-4803-8dc0-3815d029a281" (UID: "2a4cdadf-d811-4803-8dc0-3815d029a281"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.937915 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f" (UID: "7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.938055 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7aa308ba-b5b7-4acb-9744-0254560e0a1f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7aa308ba-b5b7-4acb-9744-0254560e0a1f" (UID: "7aa308ba-b5b7-4acb-9744-0254560e0a1f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.938219 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a46193a-2b19-4e7e-b0fe-090319010bba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6a46193a-2b19-4e7e-b0fe-090319010bba" (UID: "6a46193a-2b19-4e7e-b0fe-090319010bba"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.943060 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa308ba-b5b7-4acb-9744-0254560e0a1f-kube-api-access-44vhr" (OuterVolumeSpecName: "kube-api-access-44vhr") pod "7aa308ba-b5b7-4acb-9744-0254560e0a1f" (UID: "7aa308ba-b5b7-4acb-9744-0254560e0a1f"). InnerVolumeSpecName "kube-api-access-44vhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.943104 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42d94310-d877-4e8d-b540-6526c1c26626-kube-api-access-4vd95" (OuterVolumeSpecName: "kube-api-access-4vd95") pod "42d94310-d877-4e8d-b540-6526c1c26626" (UID: "42d94310-d877-4e8d-b540-6526c1c26626"). InnerVolumeSpecName "kube-api-access-4vd95". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4775]: I1216 15:14:53.943426 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3c5008-fead-4517-9aed-d02d07560a0a-kube-api-access-ck72q" (OuterVolumeSpecName: "kube-api-access-ck72q") pod "ba3c5008-fead-4517-9aed-d02d07560a0a" (UID: "ba3c5008-fead-4517-9aed-d02d07560a0a"). InnerVolumeSpecName "kube-api-access-ck72q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.038446 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vd95\" (UniqueName: \"kubernetes.io/projected/42d94310-d877-4e8d-b540-6526c1c26626-kube-api-access-4vd95\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.038494 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a46193a-2b19-4e7e-b0fe-090319010bba-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.038505 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck72q\" (UniqueName: \"kubernetes.io/projected/ba3c5008-fead-4517-9aed-d02d07560a0a-kube-api-access-ck72q\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.038513 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44vhr\" (UniqueName: \"kubernetes.io/projected/7aa308ba-b5b7-4acb-9744-0254560e0a1f-kube-api-access-44vhr\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.038523 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7aa308ba-b5b7-4acb-9744-0254560e0a1f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.038532 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.038543 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42d94310-d877-4e8d-b540-6526c1c26626-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 
15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.038552 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a4cdadf-d811-4803-8dc0-3815d029a281-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.638480 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q52vm" event={"ID":"aeb36c52-8730-43d7-a18e-6f1e3d8312ff","Type":"ContainerStarted","Data":"472065880d8deff99c6a79d6f32037d302c0ed561ce5bf9802f180450b854c26"} Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.641522 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7skc8" Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.644999 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bb88-account-create-update-tdzmc" Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.645050 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7302-account-create-update-n2t2m" Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.645069 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8b23fde4-e483-4825-969c-94ebc8396511","Type":"ContainerStarted","Data":"2d62a93833622f6ef354ebf02b0d974af11f1c1df07524aff0942dcd3a946114"} Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.645096 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8b23fde4-e483-4825-969c-94ebc8396511","Type":"ContainerStarted","Data":"5b03bd5343678977c56025674a85859b6eaabe206b5b152725bc66abb3f50c5c"} Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.645106 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8b23fde4-e483-4825-969c-94ebc8396511","Type":"ContainerStarted","Data":"4beb2a5bd669805ffa3ea2d140235eddd800fa5f974c5fab030425036451a3e7"} Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.645114 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8b23fde4-e483-4825-969c-94ebc8396511","Type":"ContainerStarted","Data":"24a276a89e483a49aaa518f0e5fb3877e96ee64f8364ca8fdfb32ad7307e0a08"} Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.646189 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-aacd-account-create-update-swcfz" Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.646227 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sxkbc" Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.646264 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-lx2gs" Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.646293 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-4908-account-create-update-plzkb" Dec 16 15:14:54 crc kubenswrapper[4775]: I1216 15:14:54.669157 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-q52vm" podStartSLOduration=7.887635706 podStartE2EDuration="12.669134486s" podCreationTimestamp="2025-12-16 15:14:42 +0000 UTC" firstStartedPulling="2025-12-16 15:14:48.723680278 +0000 UTC m=+1213.674759211" lastFinishedPulling="2025-12-16 15:14:53.505179068 +0000 UTC m=+1218.456257991" observedRunningTime="2025-12-16 15:14:54.662545288 +0000 UTC m=+1219.613624221" watchObservedRunningTime="2025-12-16 15:14:54.669134486 +0000 UTC m=+1219.620213409" Dec 16 15:14:56 crc kubenswrapper[4775]: I1216 15:14:56.711448 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8b23fde4-e483-4825-969c-94ebc8396511","Type":"ContainerStarted","Data":"631d0b4b548058e609c0fa2b5c1c2142a72f24df0e4080ab9ef5b73e24ce50e9"} Dec 16 15:14:56 crc kubenswrapper[4775]: I1216 15:14:56.712343 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8b23fde4-e483-4825-969c-94ebc8396511","Type":"ContainerStarted","Data":"9fedc5cbb69d2296947313cc88f3c0cdc375122d127d717919df91fc7911a06b"} Dec 16 15:14:56 crc kubenswrapper[4775]: I1216 15:14:56.712361 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8b23fde4-e483-4825-969c-94ebc8396511","Type":"ContainerStarted","Data":"43b72504817706aec4ab1a14833f74745e22942aa3980edc7b51d4b80063ac07"} Dec 16 15:14:56 crc kubenswrapper[4775]: I1216 15:14:56.712396 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8b23fde4-e483-4825-969c-94ebc8396511","Type":"ContainerStarted","Data":"d5a03cd2b1b141c6dc02f23ec496a011f5f7f2026ba0f552548a36bd16006f14"} Dec 16 15:14:57 crc kubenswrapper[4775]: I1216 15:14:57.719988 4775 generic.go:334] 
"Generic (PLEG): container finished" podID="a8de491d-4c4f-44bc-82d5-7d571b4920e8" containerID="89f45c4e5c40a31bc7b6642a5e24cc71e2ec3ecd178924ef14458eb6a6f20b92" exitCode=0 Dec 16 15:14:57 crc kubenswrapper[4775]: I1216 15:14:57.720045 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7lsw2" event={"ID":"a8de491d-4c4f-44bc-82d5-7d571b4920e8","Type":"ContainerDied","Data":"89f45c4e5c40a31bc7b6642a5e24cc71e2ec3ecd178924ef14458eb6a6f20b92"} Dec 16 15:14:58 crc kubenswrapper[4775]: I1216 15:14:58.735222 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8b23fde4-e483-4825-969c-94ebc8396511","Type":"ContainerStarted","Data":"99ed6761fc19ba995da5918138b62c6e949a19a365a8ae8e76904deaaf961cb1"} Dec 16 15:14:58 crc kubenswrapper[4775]: I1216 15:14:58.736840 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8b23fde4-e483-4825-969c-94ebc8396511","Type":"ContainerStarted","Data":"4dcea219686d983977d17079547b05ebbffc5c6234d373952ad9d1ea13e0512b"} Dec 16 15:14:58 crc kubenswrapper[4775]: I1216 15:14:58.736980 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8b23fde4-e483-4825-969c-94ebc8396511","Type":"ContainerStarted","Data":"2c6e46c41db3886f1e4f0a44d3cd7debef1528e63a1cb14d5d84f6f0e2fa9eb3"} Dec 16 15:14:58 crc kubenswrapper[4775]: I1216 15:14:58.737103 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8b23fde4-e483-4825-969c-94ebc8396511","Type":"ContainerStarted","Data":"1bab3048d95d58c413a08723a46de5dac11689fdd0d1c7ec52c5dd4d2f7558b4"} Dec 16 15:14:58 crc kubenswrapper[4775]: I1216 15:14:58.737212 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8b23fde4-e483-4825-969c-94ebc8396511","Type":"ContainerStarted","Data":"36013bb065c735db80045a19b8bf07f8eecd19cee211f305ff977675a5798995"} Dec 16 
15:14:58 crc kubenswrapper[4775]: I1216 15:14:58.737331 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q52vm" event={"ID":"aeb36c52-8730-43d7-a18e-6f1e3d8312ff","Type":"ContainerDied","Data":"472065880d8deff99c6a79d6f32037d302c0ed561ce5bf9802f180450b854c26"} Dec 16 15:14:58 crc kubenswrapper[4775]: I1216 15:14:58.737105 4775 generic.go:334] "Generic (PLEG): container finished" podID="aeb36c52-8730-43d7-a18e-6f1e3d8312ff" containerID="472065880d8deff99c6a79d6f32037d302c0ed561ce5bf9802f180450b854c26" exitCode=0 Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.146837 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7lsw2" Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.228969 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-config-data\") pod \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\" (UID: \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\") " Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.229087 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5vpd\" (UniqueName: \"kubernetes.io/projected/a8de491d-4c4f-44bc-82d5-7d571b4920e8-kube-api-access-p5vpd\") pod \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\" (UID: \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\") " Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.229186 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-db-sync-config-data\") pod \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\" (UID: \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\") " Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.229266 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-combined-ca-bundle\") pod \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\" (UID: \"a8de491d-4c4f-44bc-82d5-7d571b4920e8\") " Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.237158 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8de491d-4c4f-44bc-82d5-7d571b4920e8-kube-api-access-p5vpd" (OuterVolumeSpecName: "kube-api-access-p5vpd") pod "a8de491d-4c4f-44bc-82d5-7d571b4920e8" (UID: "a8de491d-4c4f-44bc-82d5-7d571b4920e8"). InnerVolumeSpecName "kube-api-access-p5vpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.238136 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a8de491d-4c4f-44bc-82d5-7d571b4920e8" (UID: "a8de491d-4c4f-44bc-82d5-7d571b4920e8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.253400 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8de491d-4c4f-44bc-82d5-7d571b4920e8" (UID: "a8de491d-4c4f-44bc-82d5-7d571b4920e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.273552 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-config-data" (OuterVolumeSpecName: "config-data") pod "a8de491d-4c4f-44bc-82d5-7d571b4920e8" (UID: "a8de491d-4c4f-44bc-82d5-7d571b4920e8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.331132 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.331168 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5vpd\" (UniqueName: \"kubernetes.io/projected/a8de491d-4c4f-44bc-82d5-7d571b4920e8-kube-api-access-p5vpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.331181 4775 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.331192 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8de491d-4c4f-44bc-82d5-7d571b4920e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.779569 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7lsw2" event={"ID":"a8de491d-4c4f-44bc-82d5-7d571b4920e8","Type":"ContainerDied","Data":"8bd9c4acff079d8f054f40e1fa3fc09a6b3f4a98336a821acf71b21ff534bcae"} Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.779796 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bd9c4acff079d8f054f40e1fa3fc09a6b3f4a98336a821acf71b21ff534bcae" Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.779643 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-7lsw2" Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.788513 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8b23fde4-e483-4825-969c-94ebc8396511","Type":"ContainerStarted","Data":"413e09c7e4c0c6891024e58ac3c92ff37f00d920f804187b6772f95be411cf05"} Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.788578 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8b23fde4-e483-4825-969c-94ebc8396511","Type":"ContainerStarted","Data":"fed9763f73d6c11ed71ba18945290803a347f8e50694fc69a0cfc7fd06ed7945"} Dec 16 15:14:59 crc kubenswrapper[4775]: I1216 15:14:59.844281 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=43.922045229 podStartE2EDuration="52.844256903s" podCreationTimestamp="2025-12-16 15:14:07 +0000 UTC" firstStartedPulling="2025-12-16 15:14:48.726689202 +0000 UTC m=+1213.677768125" lastFinishedPulling="2025-12-16 15:14:57.648900876 +0000 UTC m=+1222.599979799" observedRunningTime="2025-12-16 15:14:59.829061815 +0000 UTC m=+1224.780140768" watchObservedRunningTime="2025-12-16 15:14:59.844256903 +0000 UTC m=+1224.795335826" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.143259 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-s9dpj"] Dec 16 15:15:00 crc kubenswrapper[4775]: E1216 15:15:00.143701 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a4cdadf-d811-4803-8dc0-3815d029a281" containerName="mariadb-account-create-update" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.143725 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a4cdadf-d811-4803-8dc0-3815d029a281" containerName="mariadb-account-create-update" Dec 16 15:15:00 crc kubenswrapper[4775]: E1216 15:15:00.143738 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ba3c5008-fead-4517-9aed-d02d07560a0a" containerName="mariadb-account-create-update" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.143747 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3c5008-fead-4517-9aed-d02d07560a0a" containerName="mariadb-account-create-update" Dec 16 15:15:00 crc kubenswrapper[4775]: E1216 15:15:00.143759 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa308ba-b5b7-4acb-9744-0254560e0a1f" containerName="mariadb-database-create" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.143768 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa308ba-b5b7-4acb-9744-0254560e0a1f" containerName="mariadb-database-create" Dec 16 15:15:00 crc kubenswrapper[4775]: E1216 15:15:00.143785 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c404df9-a26b-44f1-bd1a-be3f6877f896" containerName="mariadb-database-create" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.143793 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c404df9-a26b-44f1-bd1a-be3f6877f896" containerName="mariadb-database-create" Dec 16 15:15:00 crc kubenswrapper[4775]: E1216 15:15:00.143805 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8de491d-4c4f-44bc-82d5-7d571b4920e8" containerName="glance-db-sync" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.143814 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8de491d-4c4f-44bc-82d5-7d571b4920e8" containerName="glance-db-sync" Dec 16 15:15:00 crc kubenswrapper[4775]: E1216 15:15:00.143824 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d94310-d877-4e8d-b540-6526c1c26626" containerName="mariadb-account-create-update" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.143831 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d94310-d877-4e8d-b540-6526c1c26626" containerName="mariadb-account-create-update" Dec 16 15:15:00 crc kubenswrapper[4775]: E1216 
15:15:00.143841 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f" containerName="mariadb-account-create-update" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.143849 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f" containerName="mariadb-account-create-update" Dec 16 15:15:00 crc kubenswrapper[4775]: E1216 15:15:00.143866 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a46193a-2b19-4e7e-b0fe-090319010bba" containerName="mariadb-database-create" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.143874 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a46193a-2b19-4e7e-b0fe-090319010bba" containerName="mariadb-database-create" Dec 16 15:15:00 crc kubenswrapper[4775]: E1216 15:15:00.143906 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1160f7-8d33-49ce-a91c-7c4e95335f49" containerName="mariadb-database-create" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.143915 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1160f7-8d33-49ce-a91c-7c4e95335f49" containerName="mariadb-database-create" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.144151 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3c5008-fead-4517-9aed-d02d07560a0a" containerName="mariadb-account-create-update" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.144169 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a46193a-2b19-4e7e-b0fe-090319010bba" containerName="mariadb-database-create" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.144178 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e1160f7-8d33-49ce-a91c-7c4e95335f49" containerName="mariadb-database-create" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.144195 4775 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a8de491d-4c4f-44bc-82d5-7d571b4920e8" containerName="glance-db-sync" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.144208 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d94310-d877-4e8d-b540-6526c1c26626" containerName="mariadb-account-create-update" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.144217 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f" containerName="mariadb-account-create-update" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.144236 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa308ba-b5b7-4acb-9744-0254560e0a1f" containerName="mariadb-database-create" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.144250 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a4cdadf-d811-4803-8dc0-3815d029a281" containerName="mariadb-account-create-update" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.144259 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c404df9-a26b-44f1-bd1a-be3f6877f896" containerName="mariadb-database-create" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.147257 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.166470 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm"] Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.167657 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.172375 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.172796 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.187003 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-s9dpj"] Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.209269 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-q52vm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.209605 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm"] Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.247233 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-946lv\" (UniqueName: \"kubernetes.io/projected/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-kube-api-access-946lv\") pod \"collect-profiles-29431635-s9dsm\" (UID: \"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.247293 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-s9dpj\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.247322 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-s9dpj\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.247482 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-secret-volume\") pod \"collect-profiles-29431635-s9dsm\" (UID: \"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.247586 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tswzx\" (UniqueName: \"kubernetes.io/projected/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-kube-api-access-tswzx\") pod \"dnsmasq-dns-5b946c75cc-s9dpj\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.247627 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-config-volume\") pod \"collect-profiles-29431635-s9dsm\" (UID: \"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.247654 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-config\") pod \"dnsmasq-dns-5b946c75cc-s9dpj\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " 
pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.247802 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-s9dpj\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.325051 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-s9dpj"] Dec 16 15:15:00 crc kubenswrapper[4775]: E1216 15:15:00.325862 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-tswzx ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" podUID="9b1762cf-2e5e-433b-be23-2e3fe5433c4b" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.349523 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbnmx\" (UniqueName: \"kubernetes.io/projected/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-kube-api-access-rbnmx\") pod \"aeb36c52-8730-43d7-a18e-6f1e3d8312ff\" (UID: \"aeb36c52-8730-43d7-a18e-6f1e3d8312ff\") " Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.349702 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-combined-ca-bundle\") pod \"aeb36c52-8730-43d7-a18e-6f1e3d8312ff\" (UID: \"aeb36c52-8730-43d7-a18e-6f1e3d8312ff\") " Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.349794 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-config-data\") pod \"aeb36c52-8730-43d7-a18e-6f1e3d8312ff\" 
(UID: \"aeb36c52-8730-43d7-a18e-6f1e3d8312ff\") " Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.350199 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-s9dpj\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.350306 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-946lv\" (UniqueName: \"kubernetes.io/projected/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-kube-api-access-946lv\") pod \"collect-profiles-29431635-s9dsm\" (UID: \"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.350328 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-s9dpj\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.350350 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-s9dpj\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.350403 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-secret-volume\") pod \"collect-profiles-29431635-s9dsm\" (UID: \"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.350443 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tswzx\" (UniqueName: \"kubernetes.io/projected/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-kube-api-access-tswzx\") pod \"dnsmasq-dns-5b946c75cc-s9dpj\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.350468 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-config-volume\") pod \"collect-profiles-29431635-s9dsm\" (UID: \"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.350492 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-config\") pod \"dnsmasq-dns-5b946c75cc-s9dpj\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.351844 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-s9dpj\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.351993 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-s9dpj\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " 
pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.352851 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-config\") pod \"dnsmasq-dns-5b946c75cc-s9dpj\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.353452 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-config-volume\") pod \"collect-profiles-29431635-s9dsm\" (UID: \"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.357147 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-s9dpj\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.358565 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-secret-volume\") pod \"collect-profiles-29431635-s9dsm\" (UID: \"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.367429 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-j6sdm"] Dec 16 15:15:00 crc kubenswrapper[4775]: E1216 15:15:00.370176 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb36c52-8730-43d7-a18e-6f1e3d8312ff" containerName="keystone-db-sync" Dec 16 15:15:00 crc 
kubenswrapper[4775]: I1216 15:15:00.370217 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb36c52-8730-43d7-a18e-6f1e3d8312ff" containerName="keystone-db-sync" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.370643 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeb36c52-8730-43d7-a18e-6f1e3d8312ff" containerName="keystone-db-sync" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.372039 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.393333 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-j6sdm"] Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.421768 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.422152 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-kube-api-access-rbnmx" (OuterVolumeSpecName: "kube-api-access-rbnmx") pod "aeb36c52-8730-43d7-a18e-6f1e3d8312ff" (UID: "aeb36c52-8730-43d7-a18e-6f1e3d8312ff"). InnerVolumeSpecName "kube-api-access-rbnmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.423507 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-946lv\" (UniqueName: \"kubernetes.io/projected/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-kube-api-access-946lv\") pod \"collect-profiles-29431635-s9dsm\" (UID: \"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.425743 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aeb36c52-8730-43d7-a18e-6f1e3d8312ff" (UID: "aeb36c52-8730-43d7-a18e-6f1e3d8312ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.427103 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tswzx\" (UniqueName: \"kubernetes.io/projected/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-kube-api-access-tswzx\") pod \"dnsmasq-dns-5b946c75cc-s9dpj\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.452100 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-config\") pod \"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.452161 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-ovsdbserver-nb\") pod 
\"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.452195 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv9bj\" (UniqueName: \"kubernetes.io/projected/6064a5c8-9b4f-44bc-8041-7d5972634060-kube-api-access-qv9bj\") pod \"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.452261 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.452310 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.452351 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.452420 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.452435 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbnmx\" (UniqueName: \"kubernetes.io/projected/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-kube-api-access-rbnmx\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.465975 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-config-data" (OuterVolumeSpecName: "config-data") pod "aeb36c52-8730-43d7-a18e-6f1e3d8312ff" (UID: "aeb36c52-8730-43d7-a18e-6f1e3d8312ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.549273 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.554298 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv9bj\" (UniqueName: \"kubernetes.io/projected/6064a5c8-9b4f-44bc-8041-7d5972634060-kube-api-access-qv9bj\") pod \"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.554449 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.555491 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.555591 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.555646 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.555718 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-config\") pod \"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.555777 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.555841 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb36c52-8730-43d7-a18e-6f1e3d8312ff-config-data\") on node \"crc\" 
DevicePath \"\"" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.556510 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.557211 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.557769 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.558355 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-config\") pod \"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.572769 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv9bj\" (UniqueName: \"kubernetes.io/projected/6064a5c8-9b4f-44bc-8041-7d5972634060-kube-api-access-qv9bj\") pod \"dnsmasq-dns-74f6bcbc87-j6sdm\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.751818 4775 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.814234 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q52vm" event={"ID":"aeb36c52-8730-43d7-a18e-6f1e3d8312ff","Type":"ContainerDied","Data":"c5ae5db4dd014e9337baab8684f860500b7ea47de4395467519e7a7bb528bf9f"} Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.815199 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5ae5db4dd014e9337baab8684f860500b7ea47de4395467519e7a7bb528bf9f" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.815277 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-q52vm" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.815632 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.870923 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.964768 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-dns-svc\") pod \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.964854 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-ovsdbserver-sb\") pod \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.965139 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-ovsdbserver-nb\") pod \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.965214 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tswzx\" (UniqueName: \"kubernetes.io/projected/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-kube-api-access-tswzx\") pod \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.965262 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-config\") pod \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\" (UID: \"9b1762cf-2e5e-433b-be23-2e3fe5433c4b\") " Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.966280 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9b1762cf-2e5e-433b-be23-2e3fe5433c4b" (UID: "9b1762cf-2e5e-433b-be23-2e3fe5433c4b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.966306 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9b1762cf-2e5e-433b-be23-2e3fe5433c4b" (UID: "9b1762cf-2e5e-433b-be23-2e3fe5433c4b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.968487 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9b1762cf-2e5e-433b-be23-2e3fe5433c4b" (UID: "9b1762cf-2e5e-433b-be23-2e3fe5433c4b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.968521 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-config" (OuterVolumeSpecName: "config") pod "9b1762cf-2e5e-433b-be23-2e3fe5433c4b" (UID: "9b1762cf-2e5e-433b-be23-2e3fe5433c4b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:00 crc kubenswrapper[4775]: I1216 15:15:00.974176 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-kube-api-access-tswzx" (OuterVolumeSpecName: "kube-api-access-tswzx") pod "9b1762cf-2e5e-433b-be23-2e3fe5433c4b" (UID: "9b1762cf-2e5e-433b-be23-2e3fe5433c4b"). InnerVolumeSpecName "kube-api-access-tswzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.052015 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm"] Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.067272 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.067611 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.067708 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tswzx\" (UniqueName: \"kubernetes.io/projected/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-kube-api-access-tswzx\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.067804 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.067878 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b1762cf-2e5e-433b-be23-2e3fe5433c4b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.115869 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-j6sdm"] Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.129241 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bw8mh"] Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.130757 4775 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.140493 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.140725 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.140795 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rsk8n" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.141098 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.141509 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.159352 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-2nctf"] Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.161239 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.170043 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-combined-ca-bundle\") pod \"keystone-bootstrap-bw8mh\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.170090 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzn4c\" (UniqueName: \"kubernetes.io/projected/652d1ed1-db2e-48c7-8409-69047378a6fe-kube-api-access-xzn4c\") pod \"keystone-bootstrap-bw8mh\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.170117 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-scripts\") pod \"keystone-bootstrap-bw8mh\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.170146 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-credential-keys\") pod \"keystone-bootstrap-bw8mh\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.170169 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-config-data\") pod \"keystone-bootstrap-bw8mh\" (UID: 
\"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.170187 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-fernet-keys\") pod \"keystone-bootstrap-bw8mh\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.210337 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bw8mh"] Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.242172 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-2nctf"] Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.272034 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.272162 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-combined-ca-bundle\") pod \"keystone-bootstrap-bw8mh\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.272209 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzn4c\" (UniqueName: \"kubernetes.io/projected/652d1ed1-db2e-48c7-8409-69047378a6fe-kube-api-access-xzn4c\") pod \"keystone-bootstrap-bw8mh\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 
15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.272227 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-scripts\") pod \"keystone-bootstrap-bw8mh\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.272253 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-config\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.272271 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjnnj\" (UniqueName: \"kubernetes.io/projected/4295fd10-669f-4d0f-81bd-8cf04c0d4704-kube-api-access-zjnnj\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.272305 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-credential-keys\") pod \"keystone-bootstrap-bw8mh\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.272327 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-dns-svc\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 
15:15:01.272363 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-config-data\") pod \"keystone-bootstrap-bw8mh\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.272389 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-fernet-keys\") pod \"keystone-bootstrap-bw8mh\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.272422 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.272504 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.299389 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-combined-ca-bundle\") pod \"keystone-bootstrap-bw8mh\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.328726 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-config-data\") pod \"keystone-bootstrap-bw8mh\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.328805 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-credential-keys\") pod \"keystone-bootstrap-bw8mh\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.365806 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-scripts\") pod \"keystone-bootstrap-bw8mh\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.366206 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-fernet-keys\") pod \"keystone-bootstrap-bw8mh\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.375067 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.375665 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.375903 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-config\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.376023 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjnnj\" (UniqueName: \"kubernetes.io/projected/4295fd10-669f-4d0f-81bd-8cf04c0d4704-kube-api-access-zjnnj\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.376127 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-dns-svc\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.376258 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.377557 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.378756 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.379522 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.409684 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-config\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.417036 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-dns-svc\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.424123 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzn4c\" (UniqueName: \"kubernetes.io/projected/652d1ed1-db2e-48c7-8409-69047378a6fe-kube-api-access-xzn4c\") pod 
\"keystone-bootstrap-bw8mh\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.504121 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-j6sdm"] Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.506161 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjnnj\" (UniqueName: \"kubernetes.io/projected/4295fd10-669f-4d0f-81bd-8cf04c0d4704-kube-api-access-zjnnj\") pod \"dnsmasq-dns-847c4cc679-2nctf\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.552652 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.588184 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-9j6xx"] Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.589879 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-9j6xx" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.599056 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.604187 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-fspqq" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.629487 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.685966 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-9j6xx"] Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.709202 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcp7h\" (UniqueName: \"kubernetes.io/projected/23611da1-3f26-42c4-bd23-36e0b04bdc24-kube-api-access-mcp7h\") pod \"heat-db-sync-9j6xx\" (UID: \"23611da1-3f26-42c4-bd23-36e0b04bdc24\") " pod="openstack/heat-db-sync-9j6xx" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.713197 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23611da1-3f26-42c4-bd23-36e0b04bdc24-config-data\") pod \"heat-db-sync-9j6xx\" (UID: \"23611da1-3f26-42c4-bd23-36e0b04bdc24\") " pod="openstack/heat-db-sync-9j6xx" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.713436 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23611da1-3f26-42c4-bd23-36e0b04bdc24-combined-ca-bundle\") pod \"heat-db-sync-9j6xx\" (UID: \"23611da1-3f26-42c4-bd23-36e0b04bdc24\") " pod="openstack/heat-db-sync-9j6xx" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.740799 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-cjjlj"] Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.742366 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.747401 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6n6mx" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.747714 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.747849 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.770228 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-cjjlj"] Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.814399 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23611da1-3f26-42c4-bd23-36e0b04bdc24-config-data\") pod \"heat-db-sync-9j6xx\" (UID: \"23611da1-3f26-42c4-bd23-36e0b04bdc24\") " pod="openstack/heat-db-sync-9j6xx" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.814451 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23611da1-3f26-42c4-bd23-36e0b04bdc24-combined-ca-bundle\") pod \"heat-db-sync-9j6xx\" (UID: \"23611da1-3f26-42c4-bd23-36e0b04bdc24\") " pod="openstack/heat-db-sync-9j6xx" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.814483 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-combined-ca-bundle\") pod \"placement-db-sync-cjjlj\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.814504 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-config-data\") pod \"placement-db-sync-cjjlj\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.814530 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-scripts\") pod \"placement-db-sync-cjjlj\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.814550 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10d12e79-44d3-4b3a-bd17-af547a42fc19-logs\") pod \"placement-db-sync-cjjlj\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.814588 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srtdk\" (UniqueName: \"kubernetes.io/projected/10d12e79-44d3-4b3a-bd17-af547a42fc19-kube-api-access-srtdk\") pod \"placement-db-sync-cjjlj\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.814609 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcp7h\" (UniqueName: \"kubernetes.io/projected/23611da1-3f26-42c4-bd23-36e0b04bdc24-kube-api-access-mcp7h\") pod \"heat-db-sync-9j6xx\" (UID: \"23611da1-3f26-42c4-bd23-36e0b04bdc24\") " pod="openstack/heat-db-sync-9j6xx" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.851586 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-db-sync-sw8rd"] Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.853048 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-sw8rd" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.855622 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" event={"ID":"6064a5c8-9b4f-44bc-8041-7d5972634060","Type":"ContainerStarted","Data":"e41e3b3f78cb32bb94cb214907d00642d13e52ff229624e66572a0cc0579fe64"} Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.874708 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-g5kw7" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.874723 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.875374 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23611da1-3f26-42c4-bd23-36e0b04bdc24-combined-ca-bundle\") pod \"heat-db-sync-9j6xx\" (UID: \"23611da1-3f26-42c4-bd23-36e0b04bdc24\") " pod="openstack/heat-db-sync-9j6xx" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.875498 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23611da1-3f26-42c4-bd23-36e0b04bdc24-config-data\") pod \"heat-db-sync-9j6xx\" (UID: \"23611da1-3f26-42c4-bd23-36e0b04bdc24\") " pod="openstack/heat-db-sync-9j6xx" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.884457 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-s9dpj" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.885202 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" event={"ID":"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96","Type":"ContainerStarted","Data":"95c74dbb0eb319a6f6b01a923daefdf384e25eab4dbd3b122d18d3d5fdf16d24"} Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.885264 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" event={"ID":"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96","Type":"ContainerStarted","Data":"486732d40e8563ba35f5a355f852dcd17ac223c450464cc53d8ba5d8b60b8971"} Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.894293 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-j4mx8"] Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.894636 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcp7h\" (UniqueName: \"kubernetes.io/projected/23611da1-3f26-42c4-bd23-36e0b04bdc24-kube-api-access-mcp7h\") pod \"heat-db-sync-9j6xx\" (UID: \"23611da1-3f26-42c4-bd23-36e0b04bdc24\") " pod="openstack/heat-db-sync-9j6xx" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.896191 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.909595 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.910298 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9nnnp" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.910744 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.920383 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-config-data\") pod \"placement-db-sync-cjjlj\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.920487 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-scripts\") pod \"placement-db-sync-cjjlj\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.920529 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10d12e79-44d3-4b3a-bd17-af547a42fc19-logs\") pod \"placement-db-sync-cjjlj\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.920613 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srtdk\" (UniqueName: \"kubernetes.io/projected/10d12e79-44d3-4b3a-bd17-af547a42fc19-kube-api-access-srtdk\") pod \"placement-db-sync-cjjlj\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") 
" pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.920802 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-combined-ca-bundle\") pod \"placement-db-sync-cjjlj\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.927011 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-combined-ca-bundle\") pod \"placement-db-sync-cjjlj\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.931838 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10d12e79-44d3-4b3a-bd17-af547a42fc19-logs\") pod \"placement-db-sync-cjjlj\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.932461 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-scripts\") pod \"placement-db-sync-cjjlj\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.934571 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-config-data\") pod \"placement-db-sync-cjjlj\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.953687 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-srtdk\" (UniqueName: \"kubernetes.io/projected/10d12e79-44d3-4b3a-bd17-af547a42fc19-kube-api-access-srtdk\") pod \"placement-db-sync-cjjlj\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.953769 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-sw8rd"] Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.973922 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-9j6xx" Dec 16 15:15:01 crc kubenswrapper[4775]: I1216 15:15:01.982955 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-j4mx8"] Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.000831 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.003152 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.024110 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-2nctf"] Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.025531 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-run-httpd\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.025609 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-scripts\") pod \"cinder-db-sync-j4mx8\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.025638 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de018e03-657b-4eec-8b94-2d305f9bdbcf-db-sync-config-data\") pod \"barbican-db-sync-sw8rd\" (UID: \"de018e03-657b-4eec-8b94-2d305f9bdbcf\") " pod="openstack/barbican-db-sync-sw8rd" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.025663 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-scripts\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.025706 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-config-data\") pod 
\"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.025725 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t67wj\" (UniqueName: \"kubernetes.io/projected/de018e03-657b-4eec-8b94-2d305f9bdbcf-kube-api-access-t67wj\") pod \"barbican-db-sync-sw8rd\" (UID: \"de018e03-657b-4eec-8b94-2d305f9bdbcf\") " pod="openstack/barbican-db-sync-sw8rd" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.025786 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-combined-ca-bundle\") pod \"cinder-db-sync-j4mx8\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.025810 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vlnk\" (UniqueName: \"kubernetes.io/projected/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-kube-api-access-4vlnk\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.025853 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.025942 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-db-sync-config-data\") pod \"cinder-db-sync-j4mx8\" (UID: 
\"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.025968 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-log-httpd\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.026061 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-etc-machine-id\") pod \"cinder-db-sync-j4mx8\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.026085 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de018e03-657b-4eec-8b94-2d305f9bdbcf-combined-ca-bundle\") pod \"barbican-db-sync-sw8rd\" (UID: \"de018e03-657b-4eec-8b94-2d305f9bdbcf\") " pod="openstack/barbican-db-sync-sw8rd" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.026150 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82dqt\" (UniqueName: \"kubernetes.io/projected/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-kube-api-access-82dqt\") pod \"cinder-db-sync-j4mx8\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.033027 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " 
pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.033061 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-config-data\") pod \"cinder-db-sync-j4mx8\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.034069 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.035164 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.043203 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.103465 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-5dkn9"] Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.105693 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-5dkn9" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.110685 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.111060 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-f4gmr" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.110815 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.130106 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nl6bm"] Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.135263 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bffgg\" (UniqueName: \"kubernetes.io/projected/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-kube-api-access-bffgg\") pod \"neutron-db-sync-5dkn9\" (UID: \"63559e2e-2493-4b6f-b5d2-1573af2e9ac6\") " pod="openstack/neutron-db-sync-5dkn9" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.135366 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-etc-machine-id\") pod \"cinder-db-sync-j4mx8\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.135386 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de018e03-657b-4eec-8b94-2d305f9bdbcf-combined-ca-bundle\") pod \"barbican-db-sync-sw8rd\" (UID: \"de018e03-657b-4eec-8b94-2d305f9bdbcf\") " pod="openstack/barbican-db-sync-sw8rd" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.135432 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-config\") pod \"neutron-db-sync-5dkn9\" (UID: \"63559e2e-2493-4b6f-b5d2-1573af2e9ac6\") " pod="openstack/neutron-db-sync-5dkn9" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.135468 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82dqt\" (UniqueName: \"kubernetes.io/projected/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-kube-api-access-82dqt\") pod \"cinder-db-sync-j4mx8\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.135614 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.135634 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-config-data\") pod \"cinder-db-sync-j4mx8\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.135758 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-run-httpd\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.135780 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-scripts\") pod 
\"cinder-db-sync-j4mx8\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.135772 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-etc-machine-id\") pod \"cinder-db-sync-j4mx8\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.135799 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de018e03-657b-4eec-8b94-2d305f9bdbcf-db-sync-config-data\") pod \"barbican-db-sync-sw8rd\" (UID: \"de018e03-657b-4eec-8b94-2d305f9bdbcf\") " pod="openstack/barbican-db-sync-sw8rd" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.135993 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-scripts\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.136026 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-combined-ca-bundle\") pod \"neutron-db-sync-5dkn9\" (UID: \"63559e2e-2493-4b6f-b5d2-1573af2e9ac6\") " pod="openstack/neutron-db-sync-5dkn9" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.136042 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-config-data\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 
15:15:02.136164 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t67wj\" (UniqueName: \"kubernetes.io/projected/de018e03-657b-4eec-8b94-2d305f9bdbcf-kube-api-access-t67wj\") pod \"barbican-db-sync-sw8rd\" (UID: \"de018e03-657b-4eec-8b94-2d305f9bdbcf\") " pod="openstack/barbican-db-sync-sw8rd" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.136198 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-combined-ca-bundle\") pod \"cinder-db-sync-j4mx8\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.136404 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vlnk\" (UniqueName: \"kubernetes.io/projected/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-kube-api-access-4vlnk\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.136566 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.136609 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-db-sync-config-data\") pod \"cinder-db-sync-j4mx8\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.136813 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-log-httpd\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.138096 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.139081 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-log-httpd\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.139696 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-run-httpd\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.145476 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-scripts\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.153681 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de018e03-657b-4eec-8b94-2d305f9bdbcf-combined-ca-bundle\") pod \"barbican-db-sync-sw8rd\" (UID: \"de018e03-657b-4eec-8b94-2d305f9bdbcf\") " pod="openstack/barbican-db-sync-sw8rd" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.165831 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/de018e03-657b-4eec-8b94-2d305f9bdbcf-db-sync-config-data\") pod \"barbican-db-sync-sw8rd\" (UID: \"de018e03-657b-4eec-8b94-2d305f9bdbcf\") " pod="openstack/barbican-db-sync-sw8rd" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.165972 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-5dkn9"] Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.166213 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-config-data\") pod \"cinder-db-sync-j4mx8\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.166834 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-config-data\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.168608 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-scripts\") pod \"cinder-db-sync-j4mx8\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.172993 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nl6bm"] Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.174319 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-combined-ca-bundle\") pod \"cinder-db-sync-j4mx8\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 
15:15:02.194098 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" podStartSLOduration=2.194071288 podStartE2EDuration="2.194071288s" podCreationTimestamp="2025-12-16 15:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:01.919770923 +0000 UTC m=+1226.870849866" watchObservedRunningTime="2025-12-16 15:15:02.194071288 +0000 UTC m=+1227.145150211" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.209036 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vlnk\" (UniqueName: \"kubernetes.io/projected/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-kube-api-access-4vlnk\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.209176 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-db-sync-config-data\") pod \"cinder-db-sync-j4mx8\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.209599 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82dqt\" (UniqueName: \"kubernetes.io/projected/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-kube-api-access-82dqt\") pod \"cinder-db-sync-j4mx8\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") " pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.213690 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " 
pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.215834 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t67wj\" (UniqueName: \"kubernetes.io/projected/de018e03-657b-4eec-8b94-2d305f9bdbcf-kube-api-access-t67wj\") pod \"barbican-db-sync-sw8rd\" (UID: \"de018e03-657b-4eec-8b94-2d305f9bdbcf\") " pod="openstack/barbican-db-sync-sw8rd" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.234472 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.236994 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-s9dpj"] Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.238369 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.238423 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-config\") pod \"neutron-db-sync-5dkn9\" (UID: \"63559e2e-2493-4b6f-b5d2-1573af2e9ac6\") " pod="openstack/neutron-db-sync-5dkn9" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.238474 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-dns-swift-storage-0\") pod 
\"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.238529 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.238967 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.239040 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-combined-ca-bundle\") pod \"neutron-db-sync-5dkn9\" (UID: \"63559e2e-2493-4b6f-b5d2-1573af2e9ac6\") " pod="openstack/neutron-db-sync-5dkn9" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.239165 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bffgg\" (UniqueName: \"kubernetes.io/projected/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-kube-api-access-bffgg\") pod \"neutron-db-sync-5dkn9\" (UID: \"63559e2e-2493-4b6f-b5d2-1573af2e9ac6\") " pod="openstack/neutron-db-sync-5dkn9" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.239194 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pg2c\" (UniqueName: \"kubernetes.io/projected/21b4aaed-0663-4480-8baa-4311a7aa5278-kube-api-access-5pg2c\") pod 
\"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.239242 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-config\") pod \"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.246160 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-config\") pod \"neutron-db-sync-5dkn9\" (UID: \"63559e2e-2493-4b6f-b5d2-1573af2e9ac6\") " pod="openstack/neutron-db-sync-5dkn9" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.247509 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.249086 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-combined-ca-bundle\") pod \"neutron-db-sync-5dkn9\" (UID: \"63559e2e-2493-4b6f-b5d2-1573af2e9ac6\") " pod="openstack/neutron-db-sync-5dkn9" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.261049 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-s9dpj"] Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.266491 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bffgg\" (UniqueName: \"kubernetes.io/projected/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-kube-api-access-bffgg\") pod \"neutron-db-sync-5dkn9\" (UID: \"63559e2e-2493-4b6f-b5d2-1573af2e9ac6\") " pod="openstack/neutron-db-sync-5dkn9" Dec 16 15:15:02 crc 
kubenswrapper[4775]: I1216 15:15:02.336766 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-sw8rd" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.344408 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pg2c\" (UniqueName: \"kubernetes.io/projected/21b4aaed-0663-4480-8baa-4311a7aa5278-kube-api-access-5pg2c\") pod \"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.344507 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-config\") pod \"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.344617 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.344676 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.344738 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-ovsdbserver-nb\") pod 
\"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.344777 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.346166 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-config\") pod \"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.346213 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.346499 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.346784 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" 
Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.348372 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.370352 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pg2c\" (UniqueName: \"kubernetes.io/projected/21b4aaed-0663-4480-8baa-4311a7aa5278-kube-api-access-5pg2c\") pod \"dnsmasq-dns-785d8bcb8c-nl6bm\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.386618 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.410227 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.413914 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.418119 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.422302 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lzhbs" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.425907 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.426816 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.451220 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.531550 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-5dkn9" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.539302 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bw8mh"] Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.548551 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-2nctf"] Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.560777 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/483d64ac-d722-4667-b836-88cf464097e7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.561759 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " 
pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.562013 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-scripts\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.562124 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.562200 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/483d64ac-d722-4667-b836-88cf464097e7-logs\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.562299 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-config-data\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.562487 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs5gg\" (UniqueName: \"kubernetes.io/projected/483d64ac-d722-4667-b836-88cf464097e7-kube-api-access-rs5gg\") pod \"glance-default-external-api-0\" (UID: 
\"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.583708 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.662063 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-9j6xx"] Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.670225 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-scripts\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.670326 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.670383 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/483d64ac-d722-4667-b836-88cf464097e7-logs\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.670480 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-config-data\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 
15:15:02.670637 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs5gg\" (UniqueName: \"kubernetes.io/projected/483d64ac-d722-4667-b836-88cf464097e7-kube-api-access-rs5gg\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.670706 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/483d64ac-d722-4667-b836-88cf464097e7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.670746 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.671064 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.675541 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/483d64ac-d722-4667-b836-88cf464097e7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.681042 4775 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-config-data\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.690708 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-scripts\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.694845 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.696887 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.702589 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/483d64ac-d722-4667-b836-88cf464097e7-logs\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.725525 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.728481 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs5gg\" (UniqueName: \"kubernetes.io/projected/483d64ac-d722-4667-b836-88cf464097e7-kube-api-access-rs5gg\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.742353 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.750591 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.750671 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.773380 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.775814 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.775882 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7829c820-6674-4823-a409-2dad02cffe7b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.777456 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.777578 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.777617 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h5xn\" (UniqueName: \"kubernetes.io/projected/7829c820-6674-4823-a409-2dad02cffe7b-kube-api-access-7h5xn\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.777711 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7829c820-6674-4823-a409-2dad02cffe7b-logs\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.777744 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.844545 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-cjjlj"] 
Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.879608 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.879666 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h5xn\" (UniqueName: \"kubernetes.io/projected/7829c820-6674-4823-a409-2dad02cffe7b-kube-api-access-7h5xn\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.879721 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7829c820-6674-4823-a409-2dad02cffe7b-logs\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.879751 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.879809 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.879850 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7829c820-6674-4823-a409-2dad02cffe7b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.879935 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.881074 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.881120 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.881840 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7829c820-6674-4823-a409-2dad02cffe7b-logs\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.882550 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7829c820-6674-4823-a409-2dad02cffe7b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.883208 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.887369 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.888645 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.891029 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.916212 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h5xn\" (UniqueName: 
\"kubernetes.io/projected/7829c820-6674-4823-a409-2dad02cffe7b-kube-api-access-7h5xn\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.939133 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-9j6xx" event={"ID":"23611da1-3f26-42c4-bd23-36e0b04bdc24","Type":"ContainerStarted","Data":"91ad8970bd6e6ac9d0d860ef21f42bffca556f0e4a301894573601c9f96eeae1"} Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.943832 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.946211 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cjjlj" event={"ID":"10d12e79-44d3-4b3a-bd17-af547a42fc19","Type":"ContainerStarted","Data":"165f3200ee8c91fec26023dc84a8a167d3257687b5141cd508f126501cbdabd2"} Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.964313 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" event={"ID":"6064a5c8-9b4f-44bc-8041-7d5972634060","Type":"ContainerStarted","Data":"f64b85e5b7eb362744251e7ff8f99aa448035ce7fac66c44c09ef5787cee2f37"} Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.970448 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-2nctf" event={"ID":"4295fd10-669f-4d0f-81bd-8cf04c0d4704","Type":"ContainerStarted","Data":"684b9077a0f73edefc40be15fc88850881d9662b68825c67b6ff7886c1ce3cff"} Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.973290 4775 generic.go:334] "Generic (PLEG): container finished" podID="ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96" 
containerID="95c74dbb0eb319a6f6b01a923daefdf384e25eab4dbd3b122d18d3d5fdf16d24" exitCode=0 Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.973408 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" event={"ID":"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96","Type":"ContainerDied","Data":"95c74dbb0eb319a6f6b01a923daefdf384e25eab4dbd3b122d18d3d5fdf16d24"} Dec 16 15:15:02 crc kubenswrapper[4775]: I1216 15:15:02.979826 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bw8mh" event={"ID":"652d1ed1-db2e-48c7-8409-69047378a6fe","Type":"ContainerStarted","Data":"ac51410385efb8eabacece1416f0e16d19dcb438ccab677531c1e7f5e048f7e7"} Dec 16 15:15:03 crc kubenswrapper[4775]: I1216 15:15:03.189348 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-j4mx8"] Dec 16 15:15:03 crc kubenswrapper[4775]: I1216 15:15:03.201295 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-sw8rd"] Dec 16 15:15:03 crc kubenswrapper[4775]: I1216 15:15:03.225140 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:15:03 crc kubenswrapper[4775]: I1216 15:15:03.358493 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b1762cf-2e5e-433b-be23-2e3fe5433c4b" path="/var/lib/kubelet/pods/9b1762cf-2e5e-433b-be23-2e3fe5433c4b/volumes" Dec 16 15:15:03 crc kubenswrapper[4775]: I1216 15:15:03.384813 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:15:03 crc kubenswrapper[4775]: I1216 15:15:03.404769 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nl6bm"] Dec 16 15:15:03 crc kubenswrapper[4775]: W1216 15:15:03.594101 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63559e2e_2493_4b6f_b5d2_1573af2e9ac6.slice/crio-07ddf9d7a409d0ba3a4860c11b501cae8468e05409119273ebc937e7a12d9491 WatchSource:0}: Error finding container 07ddf9d7a409d0ba3a4860c11b501cae8468e05409119273ebc937e7a12d9491: Status 404 returned error can't find the container with id 07ddf9d7a409d0ba3a4860c11b501cae8468e05409119273ebc937e7a12d9491 Dec 16 15:15:03 crc kubenswrapper[4775]: I1216 15:15:03.608548 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-5dkn9"] Dec 16 15:15:03 crc kubenswrapper[4775]: I1216 15:15:03.726610 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:15:03 crc kubenswrapper[4775]: I1216 15:15:03.997904 4775 generic.go:334] "Generic (PLEG): container finished" podID="4295fd10-669f-4d0f-81bd-8cf04c0d4704" containerID="aa76d920b93b71892d0951b70d8dfcc1428223cbff88c315fd4b60165ec56f6c" exitCode=0 Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 15:15:03.998029 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-2nctf" 
event={"ID":"4295fd10-669f-4d0f-81bd-8cf04c0d4704","Type":"ContainerDied","Data":"aa76d920b93b71892d0951b70d8dfcc1428223cbff88c315fd4b60165ec56f6c"} Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 15:15:04.027782 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b","Type":"ContainerStarted","Data":"1a90c21a17521237f5bb9900d3cf2c8187c6699b114ca8912efdd100cf43bc3a"} Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 15:15:04.039730 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"483d64ac-d722-4667-b836-88cf464097e7","Type":"ContainerStarted","Data":"84eaa5d7fb53c1a567b83012f11e4ba5241e842339a8145994b3d8bf3e411d32"} Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 15:15:04.046489 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" event={"ID":"6064a5c8-9b4f-44bc-8041-7d5972634060","Type":"ContainerDied","Data":"f64b85e5b7eb362744251e7ff8f99aa448035ce7fac66c44c09ef5787cee2f37"} Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 15:15:04.046560 4775 generic.go:334] "Generic (PLEG): container finished" podID="6064a5c8-9b4f-44bc-8041-7d5972634060" containerID="f64b85e5b7eb362744251e7ff8f99aa448035ce7fac66c44c09ef5787cee2f37" exitCode=0 Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 15:15:04.049481 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j4mx8" event={"ID":"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207","Type":"ContainerStarted","Data":"efeebc81d4ebaa959f273509057f99e8dc0db1be698af1c35836702c098eacb5"} Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 15:15:04.053126 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bw8mh" event={"ID":"652d1ed1-db2e-48c7-8409-69047378a6fe","Type":"ContainerStarted","Data":"d5e60c2d53aec878da10d04765a76a1e18632d81605d744e1e12e0afc666286f"} Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 
15:15:04.063558 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5dkn9" event={"ID":"63559e2e-2493-4b6f-b5d2-1573af2e9ac6","Type":"ContainerStarted","Data":"07ddf9d7a409d0ba3a4860c11b501cae8468e05409119273ebc937e7a12d9491"} Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 15:15:04.071570 4775 generic.go:334] "Generic (PLEG): container finished" podID="21b4aaed-0663-4480-8baa-4311a7aa5278" containerID="a6b4ed58d6c92aa964be0edf745d5099f661a8b9a396a7f466a7b0d2ca0e6b42" exitCode=0 Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 15:15:04.071772 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" event={"ID":"21b4aaed-0663-4480-8baa-4311a7aa5278","Type":"ContainerDied","Data":"a6b4ed58d6c92aa964be0edf745d5099f661a8b9a396a7f466a7b0d2ca0e6b42"} Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 15:15:04.071802 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" event={"ID":"21b4aaed-0663-4480-8baa-4311a7aa5278","Type":"ContainerStarted","Data":"39dd5abe2ff0b1a163295bd37407a4627cf8e05852c4c3e15b998aff1f0f5430"} Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 15:15:04.082510 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sw8rd" event={"ID":"de018e03-657b-4eec-8b94-2d305f9bdbcf","Type":"ContainerStarted","Data":"14fa8132abb7910585df2d71bd2b1bb9ddb0f16440b810c9aa5b09f78ff373cc"} Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 15:15:04.133441 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bw8mh" podStartSLOduration=3.133420905 podStartE2EDuration="3.133420905s" podCreationTimestamp="2025-12-16 15:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:04.109259664 +0000 UTC m=+1229.060338607" watchObservedRunningTime="2025-12-16 15:15:04.133420905 +0000 UTC 
m=+1229.084499838" Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 15:15:04.302536 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 15:15:04.424122 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 15:15:04.586435 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 15:15:04.656642 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:15:04 crc kubenswrapper[4775]: W1216 15:15:04.660608 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7829c820_6674_4823_a409_2dad02cffe7b.slice/crio-e417aa4abbef10cbd6c9af051b8b69fcea118cd0b264567cbb655e6520f44981 WatchSource:0}: Error finding container e417aa4abbef10cbd6c9af051b8b69fcea118cd0b264567cbb655e6520f44981: Status 404 returned error can't find the container with id e417aa4abbef10cbd6c9af051b8b69fcea118cd0b264567cbb655e6520f44981 Dec 16 15:15:04 crc kubenswrapper[4775]: I1216 15:15:04.951370 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.066625 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-dns-svc\") pod \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.066690 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjnnj\" (UniqueName: \"kubernetes.io/projected/4295fd10-669f-4d0f-81bd-8cf04c0d4704-kube-api-access-zjnnj\") pod \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.066821 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-dns-swift-storage-0\") pod \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.066843 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-config\") pod \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.066951 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-ovsdbserver-sb\") pod \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.068110 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-ovsdbserver-nb\") pod \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\" (UID: \"4295fd10-669f-4d0f-81bd-8cf04c0d4704\") " Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.084282 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4295fd10-669f-4d0f-81bd-8cf04c0d4704-kube-api-access-zjnnj" (OuterVolumeSpecName: "kube-api-access-zjnnj") pod "4295fd10-669f-4d0f-81bd-8cf04c0d4704" (UID: "4295fd10-669f-4d0f-81bd-8cf04c0d4704"). InnerVolumeSpecName "kube-api-access-zjnnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.087917 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.103076 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.127496 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" event={"ID":"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96","Type":"ContainerDied","Data":"486732d40e8563ba35f5a355f852dcd17ac223c450464cc53d8ba5d8b60b8971"} Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.127552 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="486732d40e8563ba35f5a355f852dcd17ac223c450464cc53d8ba5d8b60b8971" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.127650 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.142812 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4295fd10-669f-4d0f-81bd-8cf04c0d4704" (UID: "4295fd10-669f-4d0f-81bd-8cf04c0d4704"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.146116 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7829c820-6674-4823-a409-2dad02cffe7b","Type":"ContainerStarted","Data":"e417aa4abbef10cbd6c9af051b8b69fcea118cd0b264567cbb655e6520f44981"} Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.150252 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5dkn9" event={"ID":"63559e2e-2493-4b6f-b5d2-1573af2e9ac6","Type":"ContainerStarted","Data":"0bfc0a03cc9ec791094f058889bcec04dcf0c29ed44680afd7d2ee54622c2e0c"} Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.157116 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-config" (OuterVolumeSpecName: "config") pod "4295fd10-669f-4d0f-81bd-8cf04c0d4704" (UID: "4295fd10-669f-4d0f-81bd-8cf04c0d4704"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.158494 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4295fd10-669f-4d0f-81bd-8cf04c0d4704" (UID: "4295fd10-669f-4d0f-81bd-8cf04c0d4704"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.170089 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-ovsdbserver-sb\") pod \"6064a5c8-9b4f-44bc-8041-7d5972634060\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.170253 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-config-volume\") pod \"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96\" (UID: \"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96\") " Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.170339 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-ovsdbserver-nb\") pod \"6064a5c8-9b4f-44bc-8041-7d5972634060\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.170385 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-dns-svc\") pod \"6064a5c8-9b4f-44bc-8041-7d5972634060\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.170422 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-config\") pod \"6064a5c8-9b4f-44bc-8041-7d5972634060\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.170482 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-946lv\" (UniqueName: 
\"kubernetes.io/projected/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-kube-api-access-946lv\") pod \"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96\" (UID: \"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96\") " Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.170591 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-secret-volume\") pod \"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96\" (UID: \"ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96\") " Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.170645 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv9bj\" (UniqueName: \"kubernetes.io/projected/6064a5c8-9b4f-44bc-8041-7d5972634060-kube-api-access-qv9bj\") pod \"6064a5c8-9b4f-44bc-8041-7d5972634060\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.170721 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-dns-swift-storage-0\") pod \"6064a5c8-9b4f-44bc-8041-7d5972634060\" (UID: \"6064a5c8-9b4f-44bc-8041-7d5972634060\") " Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.171350 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.171369 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjnnj\" (UniqueName: \"kubernetes.io/projected/4295fd10-669f-4d0f-81bd-8cf04c0d4704-kube-api-access-zjnnj\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.171384 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.171397 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.172482 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" event={"ID":"6064a5c8-9b4f-44bc-8041-7d5972634060","Type":"ContainerDied","Data":"e41e3b3f78cb32bb94cb214907d00642d13e52ff229624e66572a0cc0579fe64"} Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.172671 4775 scope.go:117] "RemoveContainer" containerID="f64b85e5b7eb362744251e7ff8f99aa448035ce7fac66c44c09ef5787cee2f37" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.172870 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-j6sdm" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.174361 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-config-volume" (OuterVolumeSpecName: "config-volume") pod "ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96" (UID: "ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.182181 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6064a5c8-9b4f-44bc-8041-7d5972634060-kube-api-access-qv9bj" (OuterVolumeSpecName: "kube-api-access-qv9bj") pod "6064a5c8-9b4f-44bc-8041-7d5972634060" (UID: "6064a5c8-9b4f-44bc-8041-7d5972634060"). InnerVolumeSpecName "kube-api-access-qv9bj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.189123 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96" (UID: "ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.191710 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-kube-api-access-946lv" (OuterVolumeSpecName: "kube-api-access-946lv") pod "ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96" (UID: "ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96"). InnerVolumeSpecName "kube-api-access-946lv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.192634 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-5dkn9" podStartSLOduration=4.192610931 podStartE2EDuration="4.192610931s" podCreationTimestamp="2025-12-16 15:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:05.178205857 +0000 UTC m=+1230.129284800" watchObservedRunningTime="2025-12-16 15:15:05.192610931 +0000 UTC m=+1230.143689854" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.199301 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-2nctf" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.200064 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-2nctf" event={"ID":"4295fd10-669f-4d0f-81bd-8cf04c0d4704","Type":"ContainerDied","Data":"684b9077a0f73edefc40be15fc88850881d9662b68825c67b6ff7886c1ce3cff"} Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.274227 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-946lv\" (UniqueName: \"kubernetes.io/projected/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-kube-api-access-946lv\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.274298 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.274313 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv9bj\" (UniqueName: \"kubernetes.io/projected/6064a5c8-9b4f-44bc-8041-7d5972634060-kube-api-access-qv9bj\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.274325 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.336896 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6064a5c8-9b4f-44bc-8041-7d5972634060" (UID: "6064a5c8-9b4f-44bc-8041-7d5972634060"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.376263 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-config" (OuterVolumeSpecName: "config") pod "6064a5c8-9b4f-44bc-8041-7d5972634060" (UID: "6064a5c8-9b4f-44bc-8041-7d5972634060"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.376817 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.376853 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.382986 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6064a5c8-9b4f-44bc-8041-7d5972634060" (UID: "6064a5c8-9b4f-44bc-8041-7d5972634060"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.385863 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4295fd10-669f-4d0f-81bd-8cf04c0d4704" (UID: "4295fd10-669f-4d0f-81bd-8cf04c0d4704"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.388674 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4295fd10-669f-4d0f-81bd-8cf04c0d4704" (UID: "4295fd10-669f-4d0f-81bd-8cf04c0d4704"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.389104 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6064a5c8-9b4f-44bc-8041-7d5972634060" (UID: "6064a5c8-9b4f-44bc-8041-7d5972634060"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.389259 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6064a5c8-9b4f-44bc-8041-7d5972634060" (UID: "6064a5c8-9b4f-44bc-8041-7d5972634060"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.484110 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.484151 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4295fd10-669f-4d0f-81bd-8cf04c0d4704-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.484166 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.484175 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.484188 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6064a5c8-9b4f-44bc-8041-7d5972634060-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.553199 4775 scope.go:117] "RemoveContainer" containerID="aa76d920b93b71892d0951b70d8dfcc1428223cbff88c315fd4b60165ec56f6c" Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.651066 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-2nctf"] Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.674094 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-2nctf"] Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.707206 4775 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-j6sdm"] Dec 16 15:15:05 crc kubenswrapper[4775]: I1216 15:15:05.716271 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-j6sdm"] Dec 16 15:15:06 crc kubenswrapper[4775]: I1216 15:15:06.237685 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"483d64ac-d722-4667-b836-88cf464097e7","Type":"ContainerStarted","Data":"6915dea9eb238f468a945cd90498da568a8acc83e22750fe1c8aee67b14c404d"} Dec 16 15:15:06 crc kubenswrapper[4775]: I1216 15:15:06.243584 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" event={"ID":"21b4aaed-0663-4480-8baa-4311a7aa5278","Type":"ContainerStarted","Data":"e88c0d7c7c40a11b03741c5d9be899580f2c590d3430272a5da08805416f43d7"} Dec 16 15:15:06 crc kubenswrapper[4775]: I1216 15:15:06.243940 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:06 crc kubenswrapper[4775]: I1216 15:15:06.284783 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" podStartSLOduration=5.284762695 podStartE2EDuration="5.284762695s" podCreationTimestamp="2025-12-16 15:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:06.280564952 +0000 UTC m=+1231.231643885" watchObservedRunningTime="2025-12-16 15:15:06.284762695 +0000 UTC m=+1231.235841618" Dec 16 15:15:07 crc kubenswrapper[4775]: I1216 15:15:07.353486 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4295fd10-669f-4d0f-81bd-8cf04c0d4704" path="/var/lib/kubelet/pods/4295fd10-669f-4d0f-81bd-8cf04c0d4704/volumes" Dec 16 15:15:07 crc kubenswrapper[4775]: I1216 15:15:07.354698 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6064a5c8-9b4f-44bc-8041-7d5972634060" path="/var/lib/kubelet/pods/6064a5c8-9b4f-44bc-8041-7d5972634060/volumes" Dec 16 15:15:12 crc kubenswrapper[4775]: I1216 15:15:12.585092 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" Dec 16 15:15:12 crc kubenswrapper[4775]: I1216 15:15:12.638666 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhnq6"] Dec 16 15:15:12 crc kubenswrapper[4775]: I1216 15:15:12.640000 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-qhnq6" podUID="0428ec83-659c-47e6-8b58-385b582e628e" containerName="dnsmasq-dns" containerID="cri-o://c1ddbf24df25553da481fc30024757f9e28643e73a7df6cc7ab6e64fd7f8faf1" gracePeriod=10 Dec 16 15:15:12 crc kubenswrapper[4775]: I1216 15:15:12.660610 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-qhnq6" podUID="0428ec83-659c-47e6-8b58-385b582e628e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Dec 16 15:15:13 crc kubenswrapper[4775]: I1216 15:15:13.332233 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"483d64ac-d722-4667-b836-88cf464097e7","Type":"ContainerStarted","Data":"f0d520922de754f4d49d688dffef94c2622c098799d6ef7fb54efb1d6f294eb9"} Dec 16 15:15:13 crc kubenswrapper[4775]: I1216 15:15:13.335195 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7829c820-6674-4823-a409-2dad02cffe7b","Type":"ContainerStarted","Data":"fe794b90b9faf3f59c87057ca9919d556ad0cedd8e3a756fca5550c4d7873bd2"} Dec 16 15:15:14 crc kubenswrapper[4775]: I1216 15:15:14.352803 4775 generic.go:334] "Generic (PLEG): container finished" podID="0428ec83-659c-47e6-8b58-385b582e628e" 
containerID="c1ddbf24df25553da481fc30024757f9e28643e73a7df6cc7ab6e64fd7f8faf1" exitCode=0 Dec 16 15:15:14 crc kubenswrapper[4775]: I1216 15:15:14.353493 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="483d64ac-d722-4667-b836-88cf464097e7" containerName="glance-log" containerID="cri-o://6915dea9eb238f468a945cd90498da568a8acc83e22750fe1c8aee67b14c404d" gracePeriod=30 Dec 16 15:15:14 crc kubenswrapper[4775]: I1216 15:15:14.353836 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qhnq6" event={"ID":"0428ec83-659c-47e6-8b58-385b582e628e","Type":"ContainerDied","Data":"c1ddbf24df25553da481fc30024757f9e28643e73a7df6cc7ab6e64fd7f8faf1"} Dec 16 15:15:14 crc kubenswrapper[4775]: I1216 15:15:14.354257 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="483d64ac-d722-4667-b836-88cf464097e7" containerName="glance-httpd" containerID="cri-o://f0d520922de754f4d49d688dffef94c2622c098799d6ef7fb54efb1d6f294eb9" gracePeriod=30 Dec 16 15:15:15 crc kubenswrapper[4775]: I1216 15:15:15.363158 4775 generic.go:334] "Generic (PLEG): container finished" podID="483d64ac-d722-4667-b836-88cf464097e7" containerID="f0d520922de754f4d49d688dffef94c2622c098799d6ef7fb54efb1d6f294eb9" exitCode=0 Dec 16 15:15:15 crc kubenswrapper[4775]: I1216 15:15:15.363462 4775 generic.go:334] "Generic (PLEG): container finished" podID="483d64ac-d722-4667-b836-88cf464097e7" containerID="6915dea9eb238f468a945cd90498da568a8acc83e22750fe1c8aee67b14c404d" exitCode=143 Dec 16 15:15:15 crc kubenswrapper[4775]: I1216 15:15:15.363232 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"483d64ac-d722-4667-b836-88cf464097e7","Type":"ContainerDied","Data":"f0d520922de754f4d49d688dffef94c2622c098799d6ef7fb54efb1d6f294eb9"} Dec 16 15:15:15 crc kubenswrapper[4775]: I1216 
15:15:15.363520 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"483d64ac-d722-4667-b836-88cf464097e7","Type":"ContainerDied","Data":"6915dea9eb238f468a945cd90498da568a8acc83e22750fe1c8aee67b14c404d"} Dec 16 15:15:15 crc kubenswrapper[4775]: I1216 15:15:15.364946 4775 generic.go:334] "Generic (PLEG): container finished" podID="652d1ed1-db2e-48c7-8409-69047378a6fe" containerID="d5e60c2d53aec878da10d04765a76a1e18632d81605d744e1e12e0afc666286f" exitCode=0 Dec 16 15:15:15 crc kubenswrapper[4775]: I1216 15:15:15.364978 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bw8mh" event={"ID":"652d1ed1-db2e-48c7-8409-69047378a6fe","Type":"ContainerDied","Data":"d5e60c2d53aec878da10d04765a76a1e18632d81605d744e1e12e0afc666286f"} Dec 16 15:15:15 crc kubenswrapper[4775]: I1216 15:15:15.391267 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=14.391215117 podStartE2EDuration="14.391215117s" podCreationTimestamp="2025-12-16 15:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:14.378056322 +0000 UTC m=+1239.329135255" watchObservedRunningTime="2025-12-16 15:15:15.391215117 +0000 UTC m=+1240.342294040" Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.084213 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.257316 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-dns-svc\") pod \"0428ec83-659c-47e6-8b58-385b582e628e\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.257387 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-ovsdbserver-sb\") pod \"0428ec83-659c-47e6-8b58-385b582e628e\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.257448 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfvkl\" (UniqueName: \"kubernetes.io/projected/0428ec83-659c-47e6-8b58-385b582e628e-kube-api-access-hfvkl\") pod \"0428ec83-659c-47e6-8b58-385b582e628e\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.257474 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-config\") pod \"0428ec83-659c-47e6-8b58-385b582e628e\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.257607 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-ovsdbserver-nb\") pod \"0428ec83-659c-47e6-8b58-385b582e628e\" (UID: \"0428ec83-659c-47e6-8b58-385b582e628e\") " Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.263859 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0428ec83-659c-47e6-8b58-385b582e628e-kube-api-access-hfvkl" (OuterVolumeSpecName: "kube-api-access-hfvkl") pod "0428ec83-659c-47e6-8b58-385b582e628e" (UID: "0428ec83-659c-47e6-8b58-385b582e628e"). InnerVolumeSpecName "kube-api-access-hfvkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.299998 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0428ec83-659c-47e6-8b58-385b582e628e" (UID: "0428ec83-659c-47e6-8b58-385b582e628e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.323116 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0428ec83-659c-47e6-8b58-385b582e628e" (UID: "0428ec83-659c-47e6-8b58-385b582e628e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.327097 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-config" (OuterVolumeSpecName: "config") pod "0428ec83-659c-47e6-8b58-385b582e628e" (UID: "0428ec83-659c-47e6-8b58-385b582e628e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.329015 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0428ec83-659c-47e6-8b58-385b582e628e" (UID: "0428ec83-659c-47e6-8b58-385b582e628e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.360388 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfvkl\" (UniqueName: \"kubernetes.io/projected/0428ec83-659c-47e6-8b58-385b582e628e-kube-api-access-hfvkl\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.360421 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.360432 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.360441 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.360449 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0428ec83-659c-47e6-8b58-385b582e628e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.378840 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qhnq6" event={"ID":"0428ec83-659c-47e6-8b58-385b582e628e","Type":"ContainerDied","Data":"17484b5a54b9b01d9b988858a0cc00b41311073aaee44d03d4da66d1fa1893a5"} Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.378953 4775 scope.go:117] "RemoveContainer" containerID="c1ddbf24df25553da481fc30024757f9e28643e73a7df6cc7ab6e64fd7f8faf1" Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.379870 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-qhnq6" Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.427539 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhnq6"] Dec 16 15:15:16 crc kubenswrapper[4775]: I1216 15:15:16.432537 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhnq6"] Dec 16 15:15:17 crc kubenswrapper[4775]: I1216 15:15:17.355285 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0428ec83-659c-47e6-8b58-385b582e628e" path="/var/lib/kubelet/pods/0428ec83-659c-47e6-8b58-385b582e628e/volumes" Dec 16 15:15:17 crc kubenswrapper[4775]: E1216 15:15:17.822303 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Dec 16 15:15:17 crc kubenswrapper[4775]: E1216 15:15:17.822466 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-srtdk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-cjjlj_openstack(10d12e79-44d3-4b3a-bd17-af547a42fc19): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:15:17 crc kubenswrapper[4775]: E1216 15:15:17.824629 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-cjjlj" podUID="10d12e79-44d3-4b3a-bd17-af547a42fc19" Dec 16 15:15:18 crc kubenswrapper[4775]: E1216 15:15:18.398569 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-cjjlj" podUID="10d12e79-44d3-4b3a-bd17-af547a42fc19" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.602062 4775 scope.go:117] "RemoveContainer" containerID="4f60592c97995273032227a56f3ae07cfcc75f667b62ba0ba41de4fcd53e1ec2" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.664431 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.774420 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.774462 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.800445 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-config-data\") pod \"652d1ed1-db2e-48c7-8409-69047378a6fe\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.800606 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-fernet-keys\") pod \"652d1ed1-db2e-48c7-8409-69047378a6fe\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.800818 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-scripts\") pod \"652d1ed1-db2e-48c7-8409-69047378a6fe\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.800856 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-credential-keys\") pod \"652d1ed1-db2e-48c7-8409-69047378a6fe\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.800914 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzn4c\" (UniqueName: 
\"kubernetes.io/projected/652d1ed1-db2e-48c7-8409-69047378a6fe-kube-api-access-xzn4c\") pod \"652d1ed1-db2e-48c7-8409-69047378a6fe\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.800959 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-combined-ca-bundle\") pod \"652d1ed1-db2e-48c7-8409-69047378a6fe\" (UID: \"652d1ed1-db2e-48c7-8409-69047378a6fe\") " Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.808929 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "652d1ed1-db2e-48c7-8409-69047378a6fe" (UID: "652d1ed1-db2e-48c7-8409-69047378a6fe"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.809476 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/652d1ed1-db2e-48c7-8409-69047378a6fe-kube-api-access-xzn4c" (OuterVolumeSpecName: "kube-api-access-xzn4c") pod "652d1ed1-db2e-48c7-8409-69047378a6fe" (UID: "652d1ed1-db2e-48c7-8409-69047378a6fe"). InnerVolumeSpecName "kube-api-access-xzn4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.825271 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-scripts" (OuterVolumeSpecName: "scripts") pod "652d1ed1-db2e-48c7-8409-69047378a6fe" (UID: "652d1ed1-db2e-48c7-8409-69047378a6fe"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.826703 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "652d1ed1-db2e-48c7-8409-69047378a6fe" (UID: "652d1ed1-db2e-48c7-8409-69047378a6fe"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.831297 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-config-data" (OuterVolumeSpecName: "config-data") pod "652d1ed1-db2e-48c7-8409-69047378a6fe" (UID: "652d1ed1-db2e-48c7-8409-69047378a6fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.834475 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "652d1ed1-db2e-48c7-8409-69047378a6fe" (UID: "652d1ed1-db2e-48c7-8409-69047378a6fe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.868950 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.869009 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.869052 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.869708 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a2e1ee11401b69f007c34bf8d81d1b271d0b2639b666040ec08a76eb20c628c"} pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.869793 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" containerID="cri-o://1a2e1ee11401b69f007c34bf8d81d1b271d0b2639b666040ec08a76eb20c628c" gracePeriod=600 Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.902651 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.902909 4775 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.902921 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzn4c\" (UniqueName: \"kubernetes.io/projected/652d1ed1-db2e-48c7-8409-69047378a6fe-kube-api-access-xzn4c\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.902931 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.902942 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:32 crc kubenswrapper[4775]: I1216 15:15:32.902950 4775 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/652d1ed1-db2e-48c7-8409-69047378a6fe-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:33 crc kubenswrapper[4775]: E1216 15:15:33.200576 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 16 15:15:33 crc kubenswrapper[4775]: E1216 15:15:33.200759 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c 
barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t67wj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-sw8rd_openstack(de018e03-657b-4eec-8b94-2d305f9bdbcf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:15:33 crc kubenswrapper[4775]: E1216 15:15:33.202025 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-sw8rd" 
podUID="de018e03-657b-4eec-8b94-2d305f9bdbcf" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.528865 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bw8mh" event={"ID":"652d1ed1-db2e-48c7-8409-69047378a6fe","Type":"ContainerDied","Data":"ac51410385efb8eabacece1416f0e16d19dcb438ccab677531c1e7f5e048f7e7"} Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.528931 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac51410385efb8eabacece1416f0e16d19dcb438ccab677531c1e7f5e048f7e7" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.529004 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bw8mh" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.535175 4775 generic.go:334] "Generic (PLEG): container finished" podID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerID="1a2e1ee11401b69f007c34bf8d81d1b271d0b2639b666040ec08a76eb20c628c" exitCode=0 Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.535936 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerDied","Data":"1a2e1ee11401b69f007c34bf8d81d1b271d0b2639b666040ec08a76eb20c628c"} Dec 16 15:15:33 crc kubenswrapper[4775]: E1216 15:15:33.536821 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-sw8rd" podUID="de018e03-657b-4eec-8b94-2d305f9bdbcf" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.748614 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bw8mh"] Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.757233 4775 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/keystone-bootstrap-bw8mh"] Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.852898 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vgjff"] Dec 16 15:15:33 crc kubenswrapper[4775]: E1216 15:15:33.853539 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6064a5c8-9b4f-44bc-8041-7d5972634060" containerName="init" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.853557 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6064a5c8-9b4f-44bc-8041-7d5972634060" containerName="init" Dec 16 15:15:33 crc kubenswrapper[4775]: E1216 15:15:33.853565 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4295fd10-669f-4d0f-81bd-8cf04c0d4704" containerName="init" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.853571 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="4295fd10-669f-4d0f-81bd-8cf04c0d4704" containerName="init" Dec 16 15:15:33 crc kubenswrapper[4775]: E1216 15:15:33.853589 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="652d1ed1-db2e-48c7-8409-69047378a6fe" containerName="keystone-bootstrap" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.853595 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="652d1ed1-db2e-48c7-8409-69047378a6fe" containerName="keystone-bootstrap" Dec 16 15:15:33 crc kubenswrapper[4775]: E1216 15:15:33.853604 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96" containerName="collect-profiles" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.853610 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96" containerName="collect-profiles" Dec 16 15:15:33 crc kubenswrapper[4775]: E1216 15:15:33.853618 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0428ec83-659c-47e6-8b58-385b582e628e" containerName="dnsmasq-dns" Dec 16 15:15:33 crc 
kubenswrapper[4775]: I1216 15:15:33.853623 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0428ec83-659c-47e6-8b58-385b582e628e" containerName="dnsmasq-dns" Dec 16 15:15:33 crc kubenswrapper[4775]: E1216 15:15:33.853640 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0428ec83-659c-47e6-8b58-385b582e628e" containerName="init" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.853645 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0428ec83-659c-47e6-8b58-385b582e628e" containerName="init" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.853787 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6064a5c8-9b4f-44bc-8041-7d5972634060" containerName="init" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.853801 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="652d1ed1-db2e-48c7-8409-69047378a6fe" containerName="keystone-bootstrap" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.853808 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="4295fd10-669f-4d0f-81bd-8cf04c0d4704" containerName="init" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.853818 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0428ec83-659c-47e6-8b58-385b582e628e" containerName="dnsmasq-dns" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.853830 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96" containerName="collect-profiles" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.854430 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.857365 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.857506 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.857703 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.858647 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rsk8n" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.858787 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 15:15:33 crc kubenswrapper[4775]: I1216 15:15:33.876648 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vgjff"] Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.028095 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-config-data\") pod \"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.028192 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-scripts\") pod \"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.028548 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-credential-keys\") pod \"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.028627 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-fernet-keys\") pod \"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.028753 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-combined-ca-bundle\") pod \"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.028800 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5qhp\" (UniqueName: \"kubernetes.io/projected/c437c729-4da8-4394-8863-d0e4c8e73de1-kube-api-access-p5qhp\") pod \"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.130675 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-scripts\") pod \"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.130778 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-credential-keys\") pod \"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.130820 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-fernet-keys\") pod \"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.130878 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-combined-ca-bundle\") pod \"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.130927 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5qhp\" (UniqueName: \"kubernetes.io/projected/c437c729-4da8-4394-8863-d0e4c8e73de1-kube-api-access-p5qhp\") pod \"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.130952 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-config-data\") pod \"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.137027 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-fernet-keys\") pod 
\"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.137841 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-combined-ca-bundle\") pod \"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.139379 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-credential-keys\") pod \"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.139485 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-config-data\") pod \"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.152866 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-scripts\") pod \"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.154643 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5qhp\" (UniqueName: \"kubernetes.io/projected/c437c729-4da8-4394-8863-d0e4c8e73de1-kube-api-access-p5qhp\") pod \"keystone-bootstrap-vgjff\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc 
kubenswrapper[4775]: I1216 15:15:34.173606 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:34 crc kubenswrapper[4775]: E1216 15:15:34.307790 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 16 15:15:34 crc kubenswrapper[4775]: E1216 15:15:34.308068 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-
ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-82dqt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-j4mx8_openstack(6aae2a99-cf8f-4bdc-a5a0-4d548dcde207): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:15:34 crc kubenswrapper[4775]: E1216 15:15:34.309733 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-j4mx8" podUID="6aae2a99-cf8f-4bdc-a5a0-4d548dcde207" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.352146 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.435352 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"483d64ac-d722-4667-b836-88cf464097e7\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.435419 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/483d64ac-d722-4667-b836-88cf464097e7-httpd-run\") pod \"483d64ac-d722-4667-b836-88cf464097e7\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.435458 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/483d64ac-d722-4667-b836-88cf464097e7-logs\") pod \"483d64ac-d722-4667-b836-88cf464097e7\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.435479 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-combined-ca-bundle\") pod \"483d64ac-d722-4667-b836-88cf464097e7\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.435507 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-scripts\") pod \"483d64ac-d722-4667-b836-88cf464097e7\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.435549 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs5gg\" (UniqueName: 
\"kubernetes.io/projected/483d64ac-d722-4667-b836-88cf464097e7-kube-api-access-rs5gg\") pod \"483d64ac-d722-4667-b836-88cf464097e7\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.435758 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-config-data\") pod \"483d64ac-d722-4667-b836-88cf464097e7\" (UID: \"483d64ac-d722-4667-b836-88cf464097e7\") " Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.439957 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "483d64ac-d722-4667-b836-88cf464097e7" (UID: "483d64ac-d722-4667-b836-88cf464097e7"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.440303 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/483d64ac-d722-4667-b836-88cf464097e7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "483d64ac-d722-4667-b836-88cf464097e7" (UID: "483d64ac-d722-4667-b836-88cf464097e7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.443225 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/483d64ac-d722-4667-b836-88cf464097e7-kube-api-access-rs5gg" (OuterVolumeSpecName: "kube-api-access-rs5gg") pod "483d64ac-d722-4667-b836-88cf464097e7" (UID: "483d64ac-d722-4667-b836-88cf464097e7"). InnerVolumeSpecName "kube-api-access-rs5gg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.445052 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-scripts" (OuterVolumeSpecName: "scripts") pod "483d64ac-d722-4667-b836-88cf464097e7" (UID: "483d64ac-d722-4667-b836-88cf464097e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.446381 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/483d64ac-d722-4667-b836-88cf464097e7-logs" (OuterVolumeSpecName: "logs") pod "483d64ac-d722-4667-b836-88cf464097e7" (UID: "483d64ac-d722-4667-b836-88cf464097e7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.472935 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "483d64ac-d722-4667-b836-88cf464097e7" (UID: "483d64ac-d722-4667-b836-88cf464097e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.495190 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-config-data" (OuterVolumeSpecName: "config-data") pod "483d64ac-d722-4667-b836-88cf464097e7" (UID: "483d64ac-d722-4667-b836-88cf464097e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.538347 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.538425 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.538437 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/483d64ac-d722-4667-b836-88cf464097e7-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.538447 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/483d64ac-d722-4667-b836-88cf464097e7-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.538457 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.538469 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/483d64ac-d722-4667-b836-88cf464097e7-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.538478 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs5gg\" (UniqueName: \"kubernetes.io/projected/483d64ac-d722-4667-b836-88cf464097e7-kube-api-access-rs5gg\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.556863 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.556854 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"483d64ac-d722-4667-b836-88cf464097e7","Type":"ContainerDied","Data":"84eaa5d7fb53c1a567b83012f11e4ba5241e842339a8145994b3d8bf3e411d32"} Dec 16 15:15:34 crc kubenswrapper[4775]: E1216 15:15:34.558576 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-j4mx8" podUID="6aae2a99-cf8f-4bdc-a5a0-4d548dcde207" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.560039 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.606172 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.614827 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.643053 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.651139 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:15:34 crc kubenswrapper[4775]: E1216 15:15:34.651695 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="483d64ac-d722-4667-b836-88cf464097e7" containerName="glance-httpd" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 
15:15:34.652067 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="483d64ac-d722-4667-b836-88cf464097e7" containerName="glance-httpd" Dec 16 15:15:34 crc kubenswrapper[4775]: E1216 15:15:34.652090 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="483d64ac-d722-4667-b836-88cf464097e7" containerName="glance-log" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.652099 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="483d64ac-d722-4667-b836-88cf464097e7" containerName="glance-log" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.652309 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="483d64ac-d722-4667-b836-88cf464097e7" containerName="glance-httpd" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.652329 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="483d64ac-d722-4667-b836-88cf464097e7" containerName="glance-log" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.653528 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.657032 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.661413 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.664603 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 15:15:34 crc kubenswrapper[4775]: E1216 15:15:34.686250 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Dec 16 15:15:34 crc kubenswrapper[4775]: E1216 15:15:34.686471 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mcp7h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-9j6xx_openstack(23611da1-3f26-42c4-bd23-36e0b04bdc24): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 
16 15:15:34 crc kubenswrapper[4775]: E1216 15:15:34.688135 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-9j6xx" podUID="23611da1-3f26-42c4-bd23-36e0b04bdc24" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.694640 4775 scope.go:117] "RemoveContainer" containerID="e2fab779748b41d2d6bca28ee35caff1c948d4988b65a4308383bcd22a0a32a5" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.744862 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.744942 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a2a6a81-f246-4963-bfcc-40d974860cd4-logs\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.744971 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a2a6a81-f246-4963-bfcc-40d974860cd4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.745030 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.745054 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.745137 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tzgv\" (UniqueName: \"kubernetes.io/projected/0a2a6a81-f246-4963-bfcc-40d974860cd4-kube-api-access-5tzgv\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.745225 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-config-data\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.745482 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.848573 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.848723 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.848745 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a2a6a81-f246-4963-bfcc-40d974860cd4-logs\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.848780 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a2a6a81-f246-4963-bfcc-40d974860cd4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.848829 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-scripts\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.848864 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " 
pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.848882 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tzgv\" (UniqueName: \"kubernetes.io/projected/0a2a6a81-f246-4963-bfcc-40d974860cd4-kube-api-access-5tzgv\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.848926 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-config-data\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.849459 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.849785 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a2a6a81-f246-4963-bfcc-40d974860cd4-logs\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.854365 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a2a6a81-f246-4963-bfcc-40d974860cd4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc 
kubenswrapper[4775]: I1216 15:15:34.856210 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-scripts\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.857107 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-config-data\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.859622 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.863771 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.871705 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tzgv\" (UniqueName: \"kubernetes.io/projected/0a2a6a81-f246-4963-bfcc-40d974860cd4-kube-api-access-5tzgv\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.901142 4775 scope.go:117] "RemoveContainer" 
containerID="f0d520922de754f4d49d688dffef94c2622c098799d6ef7fb54efb1d6f294eb9" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.914732 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:34 crc kubenswrapper[4775]: I1216 15:15:34.979871 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:15:35 crc kubenswrapper[4775]: I1216 15:15:35.000597 4775 scope.go:117] "RemoveContainer" containerID="6915dea9eb238f468a945cd90498da568a8acc83e22750fe1c8aee67b14c404d" Dec 16 15:15:35 crc kubenswrapper[4775]: I1216 15:15:35.197826 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vgjff"] Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:35.393192 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="483d64ac-d722-4667-b836-88cf464097e7" path="/var/lib/kubelet/pods/483d64ac-d722-4667-b836-88cf464097e7/volumes" Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:35.394119 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="652d1ed1-db2e-48c7-8409-69047378a6fe" path="/var/lib/kubelet/pods/652d1ed1-db2e-48c7-8409-69047378a6fe/volumes" Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:35.584414 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vgjff" event={"ID":"c437c729-4da8-4394-8863-d0e4c8e73de1","Type":"ContainerStarted","Data":"7e7c1b8ea95822028a150f25f895cad73f855ecaafb58cefedfe9024eafa9e15"} Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:35.587268 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b","Type":"ContainerStarted","Data":"9e73013142ca6f6ab9e7af148b0250deeda8b2986cc2a0da4fc90089684cbb7f"} Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:35.598181 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"191baaa15580fc980936d4ebfb4d77ed829d99816880e694e3e02bd3ec00e6a9"} Dec 16 15:15:36 crc kubenswrapper[4775]: E1216 15:15:35.601947 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-9j6xx" podUID="23611da1-3f26-42c4-bd23-36e0b04bdc24" Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:35.954540 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:36.611843 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7829c820-6674-4823-a409-2dad02cffe7b","Type":"ContainerStarted","Data":"dee480379df7d5b47227646e7ab12f025e7864848865937100bcbbf6bb34b08b"} Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:36.612121 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7829c820-6674-4823-a409-2dad02cffe7b" containerName="glance-log" containerID="cri-o://fe794b90b9faf3f59c87057ca9919d556ad0cedd8e3a756fca5550c4d7873bd2" gracePeriod=30 Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:36.612489 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7829c820-6674-4823-a409-2dad02cffe7b" containerName="glance-httpd" 
containerID="cri-o://dee480379df7d5b47227646e7ab12f025e7864848865937100bcbbf6bb34b08b" gracePeriod=30 Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:36.615348 4775 generic.go:334] "Generic (PLEG): container finished" podID="63559e2e-2493-4b6f-b5d2-1573af2e9ac6" containerID="0bfc0a03cc9ec791094f058889bcec04dcf0c29ed44680afd7d2ee54622c2e0c" exitCode=0 Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:36.615415 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5dkn9" event={"ID":"63559e2e-2493-4b6f-b5d2-1573af2e9ac6","Type":"ContainerDied","Data":"0bfc0a03cc9ec791094f058889bcec04dcf0c29ed44680afd7d2ee54622c2e0c"} Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:36.616999 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vgjff" event={"ID":"c437c729-4da8-4394-8863-d0e4c8e73de1","Type":"ContainerStarted","Data":"ea3cdaaf6dcc755350d4442c8d87f386b372864221f0a0de2ffdad55737f6e7d"} Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:36.628081 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cjjlj" event={"ID":"10d12e79-44d3-4b3a-bd17-af547a42fc19","Type":"ContainerStarted","Data":"e552ac1296830e5c14dc2da9c5fc2bb36bdc2766b7cf8e914ff749aacb7f0e1d"} Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:36.631762 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0a2a6a81-f246-4963-bfcc-40d974860cd4","Type":"ContainerStarted","Data":"fe97ccdd2b524fba889177fa7b4be60d1aff4252728e422879dd3ed648a7e03f"} Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:36.631802 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0a2a6a81-f246-4963-bfcc-40d974860cd4","Type":"ContainerStarted","Data":"ae2789a7181190ded83220910a176d533a2d2774be98ebfcbcf44a1cfdd326d4"} Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:36.649738 4775 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=35.64971346 podStartE2EDuration="35.64971346s" podCreationTimestamp="2025-12-16 15:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:36.644570298 +0000 UTC m=+1261.595649221" watchObservedRunningTime="2025-12-16 15:15:36.64971346 +0000 UTC m=+1261.600792373" Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:36.667625 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-cjjlj" podStartSLOduration=3.65730967 podStartE2EDuration="35.667572054s" podCreationTimestamp="2025-12-16 15:15:01 +0000 UTC" firstStartedPulling="2025-12-16 15:15:02.906127831 +0000 UTC m=+1227.857206754" lastFinishedPulling="2025-12-16 15:15:34.916390215 +0000 UTC m=+1259.867469138" observedRunningTime="2025-12-16 15:15:36.661305416 +0000 UTC m=+1261.612384349" watchObservedRunningTime="2025-12-16 15:15:36.667572054 +0000 UTC m=+1261.618650977" Dec 16 15:15:36 crc kubenswrapper[4775]: I1216 15:15:36.722110 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vgjff" podStartSLOduration=3.722088904 podStartE2EDuration="3.722088904s" podCreationTimestamp="2025-12-16 15:15:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:36.697693724 +0000 UTC m=+1261.648772667" watchObservedRunningTime="2025-12-16 15:15:36.722088904 +0000 UTC m=+1261.673168087" Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.642916 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"0a2a6a81-f246-4963-bfcc-40d974860cd4","Type":"ContainerStarted","Data":"e903035b3104b3f236a6ba889058883c4f7a80ef2bdebff8c661b0b0d43be75c"} Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.651151 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b","Type":"ContainerStarted","Data":"8ccd8ca819b4e266a04faad0a0625b9006c3cb7c757ece3b5742cd732f812992"} Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.656442 4775 generic.go:334] "Generic (PLEG): container finished" podID="7829c820-6674-4823-a409-2dad02cffe7b" containerID="dee480379df7d5b47227646e7ab12f025e7864848865937100bcbbf6bb34b08b" exitCode=0 Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.656487 4775 generic.go:334] "Generic (PLEG): container finished" podID="7829c820-6674-4823-a409-2dad02cffe7b" containerID="fe794b90b9faf3f59c87057ca9919d556ad0cedd8e3a756fca5550c4d7873bd2" exitCode=143 Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.656516 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7829c820-6674-4823-a409-2dad02cffe7b","Type":"ContainerDied","Data":"dee480379df7d5b47227646e7ab12f025e7864848865937100bcbbf6bb34b08b"} Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.656553 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7829c820-6674-4823-a409-2dad02cffe7b","Type":"ContainerDied","Data":"fe794b90b9faf3f59c87057ca9919d556ad0cedd8e3a756fca5550c4d7873bd2"} Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.656564 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7829c820-6674-4823-a409-2dad02cffe7b","Type":"ContainerDied","Data":"e417aa4abbef10cbd6c9af051b8b69fcea118cd0b264567cbb655e6520f44981"} Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.656573 4775 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e417aa4abbef10cbd6c9af051b8b69fcea118cd0b264567cbb655e6520f44981" Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.665295 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.665271362 podStartE2EDuration="3.665271362s" podCreationTimestamp="2025-12-16 15:15:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:37.664505917 +0000 UTC m=+1262.615584860" watchObservedRunningTime="2025-12-16 15:15:37.665271362 +0000 UTC m=+1262.616350285" Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.707586 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.832214 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-config-data\") pod \"7829c820-6674-4823-a409-2dad02cffe7b\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.832287 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"7829c820-6674-4823-a409-2dad02cffe7b\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.832356 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-combined-ca-bundle\") pod \"7829c820-6674-4823-a409-2dad02cffe7b\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.832400 
4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h5xn\" (UniqueName: \"kubernetes.io/projected/7829c820-6674-4823-a409-2dad02cffe7b-kube-api-access-7h5xn\") pod \"7829c820-6674-4823-a409-2dad02cffe7b\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.832441 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-scripts\") pod \"7829c820-6674-4823-a409-2dad02cffe7b\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.832466 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7829c820-6674-4823-a409-2dad02cffe7b-httpd-run\") pod \"7829c820-6674-4823-a409-2dad02cffe7b\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.833029 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7829c820-6674-4823-a409-2dad02cffe7b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7829c820-6674-4823-a409-2dad02cffe7b" (UID: "7829c820-6674-4823-a409-2dad02cffe7b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.833161 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7829c820-6674-4823-a409-2dad02cffe7b-logs\") pod \"7829c820-6674-4823-a409-2dad02cffe7b\" (UID: \"7829c820-6674-4823-a409-2dad02cffe7b\") " Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.833508 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7829c820-6674-4823-a409-2dad02cffe7b-logs" (OuterVolumeSpecName: "logs") pod "7829c820-6674-4823-a409-2dad02cffe7b" (UID: "7829c820-6674-4823-a409-2dad02cffe7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.833725 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7829c820-6674-4823-a409-2dad02cffe7b-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.833741 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7829c820-6674-4823-a409-2dad02cffe7b-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.839394 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "7829c820-6674-4823-a409-2dad02cffe7b" (UID: "7829c820-6674-4823-a409-2dad02cffe7b"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.839384 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-scripts" (OuterVolumeSpecName: "scripts") pod "7829c820-6674-4823-a409-2dad02cffe7b" (UID: "7829c820-6674-4823-a409-2dad02cffe7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.845180 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7829c820-6674-4823-a409-2dad02cffe7b-kube-api-access-7h5xn" (OuterVolumeSpecName: "kube-api-access-7h5xn") pod "7829c820-6674-4823-a409-2dad02cffe7b" (UID: "7829c820-6674-4823-a409-2dad02cffe7b"). InnerVolumeSpecName "kube-api-access-7h5xn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.922769 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7829c820-6674-4823-a409-2dad02cffe7b" (UID: "7829c820-6674-4823-a409-2dad02cffe7b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.935332 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.935366 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.935379 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h5xn\" (UniqueName: \"kubernetes.io/projected/7829c820-6674-4823-a409-2dad02cffe7b-kube-api-access-7h5xn\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.935389 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.940204 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-config-data" (OuterVolumeSpecName: "config-data") pod "7829c820-6674-4823-a409-2dad02cffe7b" (UID: "7829c820-6674-4823-a409-2dad02cffe7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.959544 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 16 15:15:37 crc kubenswrapper[4775]: I1216 15:15:37.968801 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-5dkn9" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.036224 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-config\") pod \"63559e2e-2493-4b6f-b5d2-1573af2e9ac6\" (UID: \"63559e2e-2493-4b6f-b5d2-1573af2e9ac6\") " Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.036363 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-combined-ca-bundle\") pod \"63559e2e-2493-4b6f-b5d2-1573af2e9ac6\" (UID: \"63559e2e-2493-4b6f-b5d2-1573af2e9ac6\") " Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.036459 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bffgg\" (UniqueName: \"kubernetes.io/projected/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-kube-api-access-bffgg\") pod \"63559e2e-2493-4b6f-b5d2-1573af2e9ac6\" (UID: \"63559e2e-2493-4b6f-b5d2-1573af2e9ac6\") " Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.036808 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7829c820-6674-4823-a409-2dad02cffe7b-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.036828 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.040412 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-kube-api-access-bffgg" (OuterVolumeSpecName: "kube-api-access-bffgg") pod "63559e2e-2493-4b6f-b5d2-1573af2e9ac6" (UID: "63559e2e-2493-4b6f-b5d2-1573af2e9ac6"). 
InnerVolumeSpecName "kube-api-access-bffgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.060993 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-config" (OuterVolumeSpecName: "config") pod "63559e2e-2493-4b6f-b5d2-1573af2e9ac6" (UID: "63559e2e-2493-4b6f-b5d2-1573af2e9ac6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.061780 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63559e2e-2493-4b6f-b5d2-1573af2e9ac6" (UID: "63559e2e-2493-4b6f-b5d2-1573af2e9ac6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.139073 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bffgg\" (UniqueName: \"kubernetes.io/projected/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-kube-api-access-bffgg\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.139117 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.139154 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63559e2e-2493-4b6f-b5d2-1573af2e9ac6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.667598 4775 generic.go:334] "Generic (PLEG): container finished" podID="10d12e79-44d3-4b3a-bd17-af547a42fc19" 
containerID="e552ac1296830e5c14dc2da9c5fc2bb36bdc2766b7cf8e914ff749aacb7f0e1d" exitCode=0 Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.667662 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cjjlj" event={"ID":"10d12e79-44d3-4b3a-bd17-af547a42fc19","Type":"ContainerDied","Data":"e552ac1296830e5c14dc2da9c5fc2bb36bdc2766b7cf8e914ff749aacb7f0e1d"} Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.670951 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.672707 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5dkn9" event={"ID":"63559e2e-2493-4b6f-b5d2-1573af2e9ac6","Type":"ContainerDied","Data":"07ddf9d7a409d0ba3a4860c11b501cae8468e05409119273ebc937e7a12d9491"} Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.672777 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07ddf9d7a409d0ba3a4860c11b501cae8468e05409119273ebc937e7a12d9491" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.672791 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-5dkn9" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.717431 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.728055 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.751714 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:15:38 crc kubenswrapper[4775]: E1216 15:15:38.752317 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7829c820-6674-4823-a409-2dad02cffe7b" containerName="glance-log" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.752340 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7829c820-6674-4823-a409-2dad02cffe7b" containerName="glance-log" Dec 16 15:15:38 crc kubenswrapper[4775]: E1216 15:15:38.752368 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63559e2e-2493-4b6f-b5d2-1573af2e9ac6" containerName="neutron-db-sync" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.752378 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="63559e2e-2493-4b6f-b5d2-1573af2e9ac6" containerName="neutron-db-sync" Dec 16 15:15:38 crc kubenswrapper[4775]: E1216 15:15:38.752394 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7829c820-6674-4823-a409-2dad02cffe7b" containerName="glance-httpd" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.752403 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7829c820-6674-4823-a409-2dad02cffe7b" containerName="glance-httpd" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.752635 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="63559e2e-2493-4b6f-b5d2-1573af2e9ac6" containerName="neutron-db-sync" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.752662 
4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7829c820-6674-4823-a409-2dad02cffe7b" containerName="glance-log" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.752681 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7829c820-6674-4823-a409-2dad02cffe7b" containerName="glance-httpd" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.753673 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.760260 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.760765 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.770519 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.851446 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.851615 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.851684 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.861914 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cae742d1-aa2b-4462-a2ca-e3b73d58d564-logs\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.862213 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cae742d1-aa2b-4462-a2ca-e3b73d58d564-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.862343 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.862552 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.862600 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-fpzmr\" (UniqueName: \"kubernetes.io/projected/cae742d1-aa2b-4462-a2ca-e3b73d58d564-kube-api-access-fpzmr\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.965438 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.965495 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.965523 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpzmr\" (UniqueName: \"kubernetes.io/projected/cae742d1-aa2b-4462-a2ca-e3b73d58d564-kube-api-access-fpzmr\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.965578 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.965648 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.965684 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.965742 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cae742d1-aa2b-4462-a2ca-e3b73d58d564-logs\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.965799 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cae742d1-aa2b-4462-a2ca-e3b73d58d564-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.966544 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cae742d1-aa2b-4462-a2ca-e3b73d58d564-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.966849 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" 
(UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.977419 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.978131 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.989363 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cae742d1-aa2b-4462-a2ca-e3b73d58d564-logs\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:38 crc kubenswrapper[4775]: I1216 15:15:38.991805 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.022907 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " 
pod="openstack/glance-default-internal-api-0" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.028861 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpzmr\" (UniqueName: \"kubernetes.io/projected/cae742d1-aa2b-4462-a2ca-e3b73d58d564-kube-api-access-fpzmr\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.098185 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.168293 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t22wn"] Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.170263 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.215728 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t22wn"] Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.272195 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.272297 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-config\") pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.272362 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.272391 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hr84\" (UniqueName: \"kubernetes.io/projected/864f3a02-d697-4a42-b8fc-2aafd912bc62-kube-api-access-5hr84\") pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.272433 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-dns-svc\") pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.272457 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.369286 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7829c820-6674-4823-a409-2dad02cffe7b" path="/var/lib/kubelet/pods/7829c820-6674-4823-a409-2dad02cffe7b/volumes" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.374202 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.374441 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hr84\" (UniqueName: \"kubernetes.io/projected/864f3a02-d697-4a42-b8fc-2aafd912bc62-kube-api-access-5hr84\") pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.374544 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-dns-svc\") pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.374646 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.375571 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.375810 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-config\") pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.376151 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.375412 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-dns-swift-storage-0\") 
pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.376929 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.377761 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-dns-svc\") pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.379185 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-69d967f7b4-xcrpt"] Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.381323 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.382615 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.383001 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-config\") pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.393490 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-f4gmr" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.393842 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.394148 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.394561 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.399289 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69d967f7b4-xcrpt"] Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.406982 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hr84\" (UniqueName: \"kubernetes.io/projected/864f3a02-d697-4a42-b8fc-2aafd912bc62-kube-api-access-5hr84\") pod \"dnsmasq-dns-55f844cf75-t22wn\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.488062 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-config\") pod \"neutron-69d967f7b4-xcrpt\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " 
pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.488474 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-httpd-config\") pod \"neutron-69d967f7b4-xcrpt\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.488569 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vf5c\" (UniqueName: \"kubernetes.io/projected/62ce6699-2bf8-4133-ae72-6d91903df144-kube-api-access-8vf5c\") pod \"neutron-69d967f7b4-xcrpt\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.488622 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-combined-ca-bundle\") pod \"neutron-69d967f7b4-xcrpt\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.488674 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-ovndb-tls-certs\") pod \"neutron-69d967f7b4-xcrpt\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.597710 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-httpd-config\") pod \"neutron-69d967f7b4-xcrpt\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " 
pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.598490 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vf5c\" (UniqueName: \"kubernetes.io/projected/62ce6699-2bf8-4133-ae72-6d91903df144-kube-api-access-8vf5c\") pod \"neutron-69d967f7b4-xcrpt\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.598627 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-combined-ca-bundle\") pod \"neutron-69d967f7b4-xcrpt\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.598690 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-ovndb-tls-certs\") pod \"neutron-69d967f7b4-xcrpt\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.598781 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-config\") pod \"neutron-69d967f7b4-xcrpt\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.606095 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-config\") pod \"neutron-69d967f7b4-xcrpt\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.606665 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-ovndb-tls-certs\") pod \"neutron-69d967f7b4-xcrpt\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.606721 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-httpd-config\") pod \"neutron-69d967f7b4-xcrpt\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.623342 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vf5c\" (UniqueName: \"kubernetes.io/projected/62ce6699-2bf8-4133-ae72-6d91903df144-kube-api-access-8vf5c\") pod \"neutron-69d967f7b4-xcrpt\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.633610 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-combined-ca-bundle\") pod \"neutron-69d967f7b4-xcrpt\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.673694 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.732803 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vgjff" event={"ID":"c437c729-4da8-4394-8863-d0e4c8e73de1","Type":"ContainerDied","Data":"ea3cdaaf6dcc755350d4442c8d87f386b372864221f0a0de2ffdad55737f6e7d"} Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.732671 4775 generic.go:334] "Generic (PLEG): container finished" podID="c437c729-4da8-4394-8863-d0e4c8e73de1" containerID="ea3cdaaf6dcc755350d4442c8d87f386b372864221f0a0de2ffdad55737f6e7d" exitCode=0 Dec 16 15:15:39 crc kubenswrapper[4775]: I1216 15:15:39.836067 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.135295 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.149231 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:15:40 crc kubenswrapper[4775]: W1216 15:15:40.177059 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcae742d1_aa2b_4462_a2ca_e3b73d58d564.slice/crio-728fa0f2c920d936875865881d767b8119712f216c11434ac8170a432742f78c WatchSource:0}: Error finding container 728fa0f2c920d936875865881d767b8119712f216c11434ac8170a432742f78c: Status 404 returned error can't find the container with id 728fa0f2c920d936875865881d767b8119712f216c11434ac8170a432742f78c Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.227613 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10d12e79-44d3-4b3a-bd17-af547a42fc19-logs\") pod \"10d12e79-44d3-4b3a-bd17-af547a42fc19\" (UID: 
\"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.227698 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srtdk\" (UniqueName: \"kubernetes.io/projected/10d12e79-44d3-4b3a-bd17-af547a42fc19-kube-api-access-srtdk\") pod \"10d12e79-44d3-4b3a-bd17-af547a42fc19\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.227853 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-config-data\") pod \"10d12e79-44d3-4b3a-bd17-af547a42fc19\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.228000 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-combined-ca-bundle\") pod \"10d12e79-44d3-4b3a-bd17-af547a42fc19\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.228038 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-scripts\") pod \"10d12e79-44d3-4b3a-bd17-af547a42fc19\" (UID: \"10d12e79-44d3-4b3a-bd17-af547a42fc19\") " Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.228201 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10d12e79-44d3-4b3a-bd17-af547a42fc19-logs" (OuterVolumeSpecName: "logs") pod "10d12e79-44d3-4b3a-bd17-af547a42fc19" (UID: "10d12e79-44d3-4b3a-bd17-af547a42fc19"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.228781 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10d12e79-44d3-4b3a-bd17-af547a42fc19-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.238328 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10d12e79-44d3-4b3a-bd17-af547a42fc19-kube-api-access-srtdk" (OuterVolumeSpecName: "kube-api-access-srtdk") pod "10d12e79-44d3-4b3a-bd17-af547a42fc19" (UID: "10d12e79-44d3-4b3a-bd17-af547a42fc19"). InnerVolumeSpecName "kube-api-access-srtdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.239171 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-scripts" (OuterVolumeSpecName: "scripts") pod "10d12e79-44d3-4b3a-bd17-af547a42fc19" (UID: "10d12e79-44d3-4b3a-bd17-af547a42fc19"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.265343 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10d12e79-44d3-4b3a-bd17-af547a42fc19" (UID: "10d12e79-44d3-4b3a-bd17-af547a42fc19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.273875 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-config-data" (OuterVolumeSpecName: "config-data") pod "10d12e79-44d3-4b3a-bd17-af547a42fc19" (UID: "10d12e79-44d3-4b3a-bd17-af547a42fc19"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.330024 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.330061 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.330074 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srtdk\" (UniqueName: \"kubernetes.io/projected/10d12e79-44d3-4b3a-bd17-af547a42fc19-kube-api-access-srtdk\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.330087 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d12e79-44d3-4b3a-bd17-af547a42fc19-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.353137 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t22wn"] Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.572225 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69d967f7b4-xcrpt"] Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.747900 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d967f7b4-xcrpt" event={"ID":"62ce6699-2bf8-4133-ae72-6d91903df144","Type":"ContainerStarted","Data":"a45e1f82d4328081e0ceefa89bd62f300f020a6f07d27965485858b80447d6fe"} Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.753344 4775 generic.go:334] "Generic (PLEG): container finished" podID="864f3a02-d697-4a42-b8fc-2aafd912bc62" 
containerID="ef0b12ddd3a5b7c0b070733b0d227a77dca5c57ab14c432c934bba2ad989d93f" exitCode=0 Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.753454 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t22wn" event={"ID":"864f3a02-d697-4a42-b8fc-2aafd912bc62","Type":"ContainerDied","Data":"ef0b12ddd3a5b7c0b070733b0d227a77dca5c57ab14c432c934bba2ad989d93f"} Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.753516 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t22wn" event={"ID":"864f3a02-d697-4a42-b8fc-2aafd912bc62","Type":"ContainerStarted","Data":"cf12d723ec36a01fe624e42171f675c696a13e5573ade9e5f6b3d86ea333ce43"} Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.760087 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cae742d1-aa2b-4462-a2ca-e3b73d58d564","Type":"ContainerStarted","Data":"728fa0f2c920d936875865881d767b8119712f216c11434ac8170a432742f78c"} Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.787749 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cjjlj" event={"ID":"10d12e79-44d3-4b3a-bd17-af547a42fc19","Type":"ContainerDied","Data":"165f3200ee8c91fec26023dc84a8a167d3257687b5141cd508f126501cbdabd2"} Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.787819 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="165f3200ee8c91fec26023dc84a8a167d3257687b5141cd508f126501cbdabd2" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.787872 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-cjjlj" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.840979 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c66959c54-2xm6x"] Dec 16 15:15:40 crc kubenswrapper[4775]: E1216 15:15:40.841418 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d12e79-44d3-4b3a-bd17-af547a42fc19" containerName="placement-db-sync" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.841439 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d12e79-44d3-4b3a-bd17-af547a42fc19" containerName="placement-db-sync" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.841619 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d12e79-44d3-4b3a-bd17-af547a42fc19" containerName="placement-db-sync" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.842554 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.851821 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c66959c54-2xm6x"] Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.855805 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.856035 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.856151 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6n6mx" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.856261 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.856797 4775 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-placement-public-svc" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.944299 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1a21b6b-9081-4060-8bc8-566c2a60bde6-combined-ca-bundle\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.944383 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bwxx\" (UniqueName: \"kubernetes.io/projected/c1a21b6b-9081-4060-8bc8-566c2a60bde6-kube-api-access-2bwxx\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.944410 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1a21b6b-9081-4060-8bc8-566c2a60bde6-public-tls-certs\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.944469 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1a21b6b-9081-4060-8bc8-566c2a60bde6-logs\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.944511 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1a21b6b-9081-4060-8bc8-566c2a60bde6-config-data\") pod \"placement-c66959c54-2xm6x\" (UID: 
\"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.944528 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1a21b6b-9081-4060-8bc8-566c2a60bde6-scripts\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:40 crc kubenswrapper[4775]: I1216 15:15:40.944551 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1a21b6b-9081-4060-8bc8-566c2a60bde6-internal-tls-certs\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.048741 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1a21b6b-9081-4060-8bc8-566c2a60bde6-public-tls-certs\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.048910 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1a21b6b-9081-4060-8bc8-566c2a60bde6-logs\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.048996 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1a21b6b-9081-4060-8bc8-566c2a60bde6-config-data\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" 
Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.049024 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1a21b6b-9081-4060-8bc8-566c2a60bde6-scripts\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.049072 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1a21b6b-9081-4060-8bc8-566c2a60bde6-internal-tls-certs\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.049130 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1a21b6b-9081-4060-8bc8-566c2a60bde6-combined-ca-bundle\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.049228 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bwxx\" (UniqueName: \"kubernetes.io/projected/c1a21b6b-9081-4060-8bc8-566c2a60bde6-kube-api-access-2bwxx\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.049552 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1a21b6b-9081-4060-8bc8-566c2a60bde6-logs\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.066335 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1a21b6b-9081-4060-8bc8-566c2a60bde6-internal-tls-certs\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.068725 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1a21b6b-9081-4060-8bc8-566c2a60bde6-scripts\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.073819 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1a21b6b-9081-4060-8bc8-566c2a60bde6-combined-ca-bundle\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.075038 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1a21b6b-9081-4060-8bc8-566c2a60bde6-public-tls-certs\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.075718 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1a21b6b-9081-4060-8bc8-566c2a60bde6-config-data\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.104079 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bwxx\" (UniqueName: 
\"kubernetes.io/projected/c1a21b6b-9081-4060-8bc8-566c2a60bde6-kube-api-access-2bwxx\") pod \"placement-c66959c54-2xm6x\" (UID: \"c1a21b6b-9081-4060-8bc8-566c2a60bde6\") " pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.217520 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.371039 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.455869 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-config-data\") pod \"c437c729-4da8-4394-8863-d0e4c8e73de1\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.456285 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-fernet-keys\") pod \"c437c729-4da8-4394-8863-d0e4c8e73de1\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.456328 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-combined-ca-bundle\") pod \"c437c729-4da8-4394-8863-d0e4c8e73de1\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.456372 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5qhp\" (UniqueName: \"kubernetes.io/projected/c437c729-4da8-4394-8863-d0e4c8e73de1-kube-api-access-p5qhp\") pod \"c437c729-4da8-4394-8863-d0e4c8e73de1\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " 
Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.456436 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-credential-keys\") pod \"c437c729-4da8-4394-8863-d0e4c8e73de1\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.456521 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-scripts\") pod \"c437c729-4da8-4394-8863-d0e4c8e73de1\" (UID: \"c437c729-4da8-4394-8863-d0e4c8e73de1\") " Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.476074 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c437c729-4da8-4394-8863-d0e4c8e73de1-kube-api-access-p5qhp" (OuterVolumeSpecName: "kube-api-access-p5qhp") pod "c437c729-4da8-4394-8863-d0e4c8e73de1" (UID: "c437c729-4da8-4394-8863-d0e4c8e73de1"). InnerVolumeSpecName "kube-api-access-p5qhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.477030 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c437c729-4da8-4394-8863-d0e4c8e73de1" (UID: "c437c729-4da8-4394-8863-d0e4c8e73de1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.477141 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-scripts" (OuterVolumeSpecName: "scripts") pod "c437c729-4da8-4394-8863-d0e4c8e73de1" (UID: "c437c729-4da8-4394-8863-d0e4c8e73de1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.507582 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c437c729-4da8-4394-8863-d0e4c8e73de1" (UID: "c437c729-4da8-4394-8863-d0e4c8e73de1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.525997 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-config-data" (OuterVolumeSpecName: "config-data") pod "c437c729-4da8-4394-8863-d0e4c8e73de1" (UID: "c437c729-4da8-4394-8863-d0e4c8e73de1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.532786 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c437c729-4da8-4394-8863-d0e4c8e73de1" (UID: "c437c729-4da8-4394-8863-d0e4c8e73de1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.558522 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.558558 4775 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.558571 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.558583 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5qhp\" (UniqueName: \"kubernetes.io/projected/c437c729-4da8-4394-8863-d0e4c8e73de1-kube-api-access-p5qhp\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.558593 4775 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.558601 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c437c729-4da8-4394-8863-d0e4c8e73de1-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.827321 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cae742d1-aa2b-4462-a2ca-e3b73d58d564","Type":"ContainerStarted","Data":"2ba717144947e6116cbb8342108d99471e1beb3f36cdefc8beba61b3c7ecf227"} Dec 16 15:15:41 crc 
kubenswrapper[4775]: I1216 15:15:41.831244 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vgjff" event={"ID":"c437c729-4da8-4394-8863-d0e4c8e73de1","Type":"ContainerDied","Data":"7e7c1b8ea95822028a150f25f895cad73f855ecaafb58cefedfe9024eafa9e15"} Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.831524 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e7c1b8ea95822028a150f25f895cad73f855ecaafb58cefedfe9024eafa9e15" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.831610 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vgjff" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.835169 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d967f7b4-xcrpt" event={"ID":"62ce6699-2bf8-4133-ae72-6d91903df144","Type":"ContainerStarted","Data":"34cd39d5a48688c95c58599921f90d7909125613d4905cb3cf4ec80967141470"} Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.835196 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d967f7b4-xcrpt" event={"ID":"62ce6699-2bf8-4133-ae72-6d91903df144","Type":"ContainerStarted","Data":"168eba86b02448067af48492642ba60e548ea855e67f9cc16dc59fcc0259b829"} Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.836006 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.850198 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t22wn" event={"ID":"864f3a02-d697-4a42-b8fc-2aafd912bc62","Type":"ContainerStarted","Data":"8ea4ab6cce773d83c4f4ad66fa77cb5adb6b1bebe94e845f00036470cfaa6533"} Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.851080 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:15:41 crc 
kubenswrapper[4775]: I1216 15:15:41.858427 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c66959c54-2xm6x"] Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.924770 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-69d967f7b4-xcrpt" podStartSLOduration=2.924749786 podStartE2EDuration="2.924749786s" podCreationTimestamp="2025-12-16 15:15:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:41.883263747 +0000 UTC m=+1266.834342700" watchObservedRunningTime="2025-12-16 15:15:41.924749786 +0000 UTC m=+1266.875828709" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.925190 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-t22wn" podStartSLOduration=2.925185589 podStartE2EDuration="2.925185589s" podCreationTimestamp="2025-12-16 15:15:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:41.92233903 +0000 UTC m=+1266.873417963" watchObservedRunningTime="2025-12-16 15:15:41.925185589 +0000 UTC m=+1266.876264512" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.986718 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-779ff79b57-nb7bt"] Dec 16 15:15:41 crc kubenswrapper[4775]: E1216 15:15:41.988730 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c437c729-4da8-4394-8863-d0e4c8e73de1" containerName="keystone-bootstrap" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.996226 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c437c729-4da8-4394-8863-d0e4c8e73de1" containerName="keystone-bootstrap" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.996633 4775 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c437c729-4da8-4394-8863-d0e4c8e73de1" containerName="keystone-bootstrap" Dec 16 15:15:41 crc kubenswrapper[4775]: I1216 15:15:41.997919 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.005532 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.005537 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.005683 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.005721 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rsk8n" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.005771 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.006467 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.069565 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-779ff79b57-nb7bt"] Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.077731 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-fernet-keys\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.077818 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-public-tls-certs\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.077850 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-config-data\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.077877 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnktc\" (UniqueName: \"kubernetes.io/projected/2e88e1b8-8837-49a6-9769-ddab7adfb812-kube-api-access-dnktc\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.077921 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-combined-ca-bundle\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.077962 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-credential-keys\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.078014 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-scripts\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.078035 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-internal-tls-certs\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.185034 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-combined-ca-bundle\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.185527 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-credential-keys\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.185694 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-scripts\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.185801 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-internal-tls-certs\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.185979 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-fernet-keys\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.186111 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-public-tls-certs\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.186196 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-config-data\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.186273 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnktc\" (UniqueName: \"kubernetes.io/projected/2e88e1b8-8837-49a6-9769-ddab7adfb812-kube-api-access-dnktc\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.193307 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-internal-tls-certs\") pod 
\"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.196129 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-combined-ca-bundle\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.203520 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-credential-keys\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.203741 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-fernet-keys\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.208714 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-config-data\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.209490 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-public-tls-certs\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 
crc kubenswrapper[4775]: I1216 15:15:42.210309 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnktc\" (UniqueName: \"kubernetes.io/projected/2e88e1b8-8837-49a6-9769-ddab7adfb812-kube-api-access-dnktc\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.217207 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e88e1b8-8837-49a6-9769-ddab7adfb812-scripts\") pod \"keystone-779ff79b57-nb7bt\" (UID: \"2e88e1b8-8837-49a6-9769-ddab7adfb812\") " pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.369830 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.428707 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-69cbb5df9f-wmvhj"] Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.431315 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-69cbb5df9f-wmvhj" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.433292 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.435355 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.443692 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69cbb5df9f-wmvhj"] Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.597955 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-httpd-config\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.598082 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-combined-ca-bundle\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.598107 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-ovndb-tls-certs\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.598153 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-internal-tls-certs\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.598285 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmkxj\" (UniqueName: \"kubernetes.io/projected/66ed3d02-9c61-43a8-90bf-35d00458d088-kube-api-access-nmkxj\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.598391 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-public-tls-certs\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.598477 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-config\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.700192 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-internal-tls-certs\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.700271 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmkxj\" (UniqueName: 
\"kubernetes.io/projected/66ed3d02-9c61-43a8-90bf-35d00458d088-kube-api-access-nmkxj\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.700310 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-public-tls-certs\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.700355 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-config\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.700438 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-httpd-config\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.700503 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-combined-ca-bundle\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj" Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.700529 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-ovndb-tls-certs\") pod 
\"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj"
Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.706919 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-internal-tls-certs\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj"
Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.708980 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-public-tls-certs\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj"
Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.710204 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-ovndb-tls-certs\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj"
Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.711172 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-config\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj"
Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.712003 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-httpd-config\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj"
Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.719082 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ed3d02-9c61-43a8-90bf-35d00458d088-combined-ca-bundle\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj"
Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.719851 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmkxj\" (UniqueName: \"kubernetes.io/projected/66ed3d02-9c61-43a8-90bf-35d00458d088-kube-api-access-nmkxj\") pod \"neutron-69cbb5df9f-wmvhj\" (UID: \"66ed3d02-9c61-43a8-90bf-35d00458d088\") " pod="openstack/neutron-69cbb5df9f-wmvhj"
Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.811303 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69cbb5df9f-wmvhj"
Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.871160 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cae742d1-aa2b-4462-a2ca-e3b73d58d564","Type":"ContainerStarted","Data":"298cb301eb8afbe6b4e5205af068322e1c3698f9b518c92ec1229c8568754020"}
Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.874725 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c66959c54-2xm6x" event={"ID":"c1a21b6b-9081-4060-8bc8-566c2a60bde6","Type":"ContainerStarted","Data":"c5ef843604372792d71964f8536ef9246db94e94028cc5fcf6c24fdec172d784"}
Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.874758 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c66959c54-2xm6x" event={"ID":"c1a21b6b-9081-4060-8bc8-566c2a60bde6","Type":"ContainerStarted","Data":"e0c0dadb0175e926e51f692498a84681ce01fbfd741eacde10bba24d72d6809c"}
Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.940907 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.940860493 podStartE2EDuration="4.940860493s" podCreationTimestamp="2025-12-16 15:15:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:42.900105238 +0000 UTC m=+1267.851184181" watchObservedRunningTime="2025-12-16 15:15:42.940860493 +0000 UTC m=+1267.891939416"
Dec 16 15:15:42 crc kubenswrapper[4775]: I1216 15:15:42.953405 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-779ff79b57-nb7bt"]
Dec 16 15:15:43 crc kubenswrapper[4775]: I1216 15:15:43.553779 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69cbb5df9f-wmvhj"]
Dec 16 15:15:43 crc kubenswrapper[4775]: I1216 15:15:43.895972 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69cbb5df9f-wmvhj" event={"ID":"66ed3d02-9c61-43a8-90bf-35d00458d088","Type":"ContainerStarted","Data":"2b7770a28009d9645121716459a4c8c2f5bad45af20ee6e57beb7122ab491182"}
Dec 16 15:15:43 crc kubenswrapper[4775]: I1216 15:15:43.900452 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-779ff79b57-nb7bt" event={"ID":"2e88e1b8-8837-49a6-9769-ddab7adfb812","Type":"ContainerStarted","Data":"196ae3527e4d56a78a8345037f0c4d8ee7255274648de1266fd22b3baf0b0a17"}
Dec 16 15:15:43 crc kubenswrapper[4775]: I1216 15:15:43.900537 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-779ff79b57-nb7bt" event={"ID":"2e88e1b8-8837-49a6-9769-ddab7adfb812","Type":"ContainerStarted","Data":"24115d5b08c8c66e3f37080b06c0c4d54143785b3c9f6d5d59b25fb44f51394d"}
Dec 16 15:15:44 crc kubenswrapper[4775]: I1216 15:15:44.957554 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69cbb5df9f-wmvhj" event={"ID":"66ed3d02-9c61-43a8-90bf-35d00458d088","Type":"ContainerStarted","Data":"29e52087a866239ec62863a45c8e7639f9f1dac64fbc5fb0dccd2b349bdcf2c4"}
Dec 16 15:15:44 crc kubenswrapper[4775]: I1216 15:15:44.979646 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c66959c54-2xm6x" event={"ID":"c1a21b6b-9081-4060-8bc8-566c2a60bde6","Type":"ContainerStarted","Data":"fc91d4a600fae5caa94bd01c2765e1088cbb2534b6ec843b50df88165a868a46"}
Dec 16 15:15:44 crc kubenswrapper[4775]: I1216 15:15:44.979690 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-c66959c54-2xm6x"
Dec 16 15:15:44 crc kubenswrapper[4775]: I1216 15:15:44.979715 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-779ff79b57-nb7bt"
Dec 16 15:15:44 crc kubenswrapper[4775]: I1216 15:15:44.979732 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-c66959c54-2xm6x"
Dec 16 15:15:44 crc kubenswrapper[4775]: I1216 15:15:44.980839 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 16 15:15:44 crc kubenswrapper[4775]: I1216 15:15:44.980882 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 16 15:15:45 crc kubenswrapper[4775]: I1216 15:15:45.026142 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-779ff79b57-nb7bt" podStartSLOduration=4.026114131 podStartE2EDuration="4.026114131s" podCreationTimestamp="2025-12-16 15:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:45.016399915 +0000 UTC m=+1269.967478848" watchObservedRunningTime="2025-12-16 15:15:45.026114131 +0000 UTC m=+1269.977193054"
Dec 16 15:15:45 crc kubenswrapper[4775]: I1216 15:15:45.050830 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-c66959c54-2xm6x" podStartSLOduration=5.05080743 podStartE2EDuration="5.05080743s" podCreationTimestamp="2025-12-16 15:15:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:45.047088343 +0000 UTC m=+1269.998167286" watchObservedRunningTime="2025-12-16 15:15:45.05080743 +0000 UTC m=+1270.001886353"
Dec 16 15:15:45 crc kubenswrapper[4775]: I1216 15:15:45.057276 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 16 15:15:45 crc kubenswrapper[4775]: I1216 15:15:45.060382 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 16 15:15:45 crc kubenswrapper[4775]: I1216 15:15:45.987225 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 16 15:15:45 crc kubenswrapper[4775]: I1216 15:15:45.987542 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 16 15:15:48 crc kubenswrapper[4775]: I1216 15:15:48.101838 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 16 15:15:48 crc kubenswrapper[4775]: I1216 15:15:48.102583 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 16 15:15:48 crc kubenswrapper[4775]: I1216 15:15:48.178508 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 16 15:15:49 crc kubenswrapper[4775]: I1216 15:15:49.383292 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 16 15:15:49 crc kubenswrapper[4775]: I1216 15:15:49.383756 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 16 15:15:49 crc kubenswrapper[4775]: I1216 15:15:49.452357 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 16 15:15:49 crc kubenswrapper[4775]: I1216 15:15:49.462057 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 16 15:15:49 crc kubenswrapper[4775]: I1216 15:15:49.676963 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-t22wn"
Dec 16 15:15:49 crc kubenswrapper[4775]: I1216 15:15:49.764338 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nl6bm"]
Dec 16 15:15:49 crc kubenswrapper[4775]: I1216 15:15:49.764603 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" podUID="21b4aaed-0663-4480-8baa-4311a7aa5278" containerName="dnsmasq-dns" containerID="cri-o://e88c0d7c7c40a11b03741c5d9be899580f2c590d3430272a5da08805416f43d7" gracePeriod=10
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.049511 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j4mx8" event={"ID":"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207","Type":"ContainerStarted","Data":"553f825b26017179918aa15e17aad6de8ef28c4963820bef5b53cc7beed7a785"}
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.061600 4775 generic.go:334] "Generic (PLEG): container finished" podID="21b4aaed-0663-4480-8baa-4311a7aa5278" containerID="e88c0d7c7c40a11b03741c5d9be899580f2c590d3430272a5da08805416f43d7" exitCode=0
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.061992 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" event={"ID":"21b4aaed-0663-4480-8baa-4311a7aa5278","Type":"ContainerDied","Data":"e88c0d7c7c40a11b03741c5d9be899580f2c590d3430272a5da08805416f43d7"}
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.064517 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b","Type":"ContainerStarted","Data":"ad1192d4f9bfce3336e0934046cf26aab5fdc6fa9de19030bc3651ed4b32b8e9"}
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.066306 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69cbb5df9f-wmvhj" event={"ID":"66ed3d02-9c61-43a8-90bf-35d00458d088","Type":"ContainerStarted","Data":"fc01dea46f074fcdbb1798c436b8e737a899386fc4a7b1834213ff1f4082525f"}
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.066567 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-69cbb5df9f-wmvhj"
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.079075 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-j4mx8" podStartSLOduration=3.128143365 podStartE2EDuration="49.079048788s" podCreationTimestamp="2025-12-16 15:15:01 +0000 UTC" firstStartedPulling="2025-12-16 15:15:03.216816454 +0000 UTC m=+1228.167895377" lastFinishedPulling="2025-12-16 15:15:49.167721877 +0000 UTC m=+1274.118800800" observedRunningTime="2025-12-16 15:15:50.069597801 +0000 UTC m=+1275.020676744" watchObservedRunningTime="2025-12-16 15:15:50.079048788 +0000 UTC m=+1275.030127711"
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.095504 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sw8rd" event={"ID":"de018e03-657b-4eec-8b94-2d305f9bdbcf","Type":"ContainerStarted","Data":"d9a19fea1ed0cd5dce2d31249b71ac0e2c5481f696e72971d6b95d27b3d61533"}
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.096476 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.096738 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.120830 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-69cbb5df9f-wmvhj" podStartSLOduration=8.114863639 podStartE2EDuration="8.114863639s" podCreationTimestamp="2025-12-16 15:15:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:50.107498236 +0000 UTC m=+1275.058577169" watchObservedRunningTime="2025-12-16 15:15:50.114863639 +0000 UTC m=+1275.065942562"
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.164516 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-sw8rd" podStartSLOduration=2.917602594 podStartE2EDuration="49.164491924s" podCreationTimestamp="2025-12-16 15:15:01 +0000 UTC" firstStartedPulling="2025-12-16 15:15:03.222182804 +0000 UTC m=+1228.173261727" lastFinishedPulling="2025-12-16 15:15:49.469072134 +0000 UTC m=+1274.420151057" observedRunningTime="2025-12-16 15:15:50.139534717 +0000 UTC m=+1275.090613640" watchObservedRunningTime="2025-12-16 15:15:50.164491924 +0000 UTC m=+1275.115570847"
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.311516 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm"
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.481498 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pg2c\" (UniqueName: \"kubernetes.io/projected/21b4aaed-0663-4480-8baa-4311a7aa5278-kube-api-access-5pg2c\") pod \"21b4aaed-0663-4480-8baa-4311a7aa5278\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") "
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.481559 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-config\") pod \"21b4aaed-0663-4480-8baa-4311a7aa5278\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") "
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.481741 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-dns-swift-storage-0\") pod \"21b4aaed-0663-4480-8baa-4311a7aa5278\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") "
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.481792 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-dns-svc\") pod \"21b4aaed-0663-4480-8baa-4311a7aa5278\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") "
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.481811 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-ovsdbserver-sb\") pod \"21b4aaed-0663-4480-8baa-4311a7aa5278\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") "
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.481840 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-ovsdbserver-nb\") pod \"21b4aaed-0663-4480-8baa-4311a7aa5278\" (UID: \"21b4aaed-0663-4480-8baa-4311a7aa5278\") "
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.496513 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b4aaed-0663-4480-8baa-4311a7aa5278-kube-api-access-5pg2c" (OuterVolumeSpecName: "kube-api-access-5pg2c") pod "21b4aaed-0663-4480-8baa-4311a7aa5278" (UID: "21b4aaed-0663-4480-8baa-4311a7aa5278"). InnerVolumeSpecName "kube-api-access-5pg2c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.542651 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "21b4aaed-0663-4480-8baa-4311a7aa5278" (UID: "21b4aaed-0663-4480-8baa-4311a7aa5278"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.544766 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "21b4aaed-0663-4480-8baa-4311a7aa5278" (UID: "21b4aaed-0663-4480-8baa-4311a7aa5278"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.551636 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21b4aaed-0663-4480-8baa-4311a7aa5278" (UID: "21b4aaed-0663-4480-8baa-4311a7aa5278"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.576782 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-config" (OuterVolumeSpecName: "config") pod "21b4aaed-0663-4480-8baa-4311a7aa5278" (UID: "21b4aaed-0663-4480-8baa-4311a7aa5278"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.584540 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pg2c\" (UniqueName: \"kubernetes.io/projected/21b4aaed-0663-4480-8baa-4311a7aa5278-kube-api-access-5pg2c\") on node \"crc\" DevicePath \"\""
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.584580 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-config\") on node \"crc\" DevicePath \"\""
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.584591 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.584601 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.584610 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.597727 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "21b4aaed-0663-4480-8baa-4311a7aa5278" (UID: "21b4aaed-0663-4480-8baa-4311a7aa5278"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 15:15:50 crc kubenswrapper[4775]: I1216 15:15:50.686803 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21b4aaed-0663-4480-8baa-4311a7aa5278-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 16 15:15:51 crc kubenswrapper[4775]: I1216 15:15:51.133672 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-9j6xx" event={"ID":"23611da1-3f26-42c4-bd23-36e0b04bdc24","Type":"ContainerStarted","Data":"0bb60f91a6b144e70dc4dc03c61d897e0b0e04862929f94b4ec663398674d364"}
Dec 16 15:15:51 crc kubenswrapper[4775]: I1216 15:15:51.138291 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm" event={"ID":"21b4aaed-0663-4480-8baa-4311a7aa5278","Type":"ContainerDied","Data":"39dd5abe2ff0b1a163295bd37407a4627cf8e05852c4c3e15b998aff1f0f5430"}
Dec 16 15:15:51 crc kubenswrapper[4775]: I1216 15:15:51.138458 4775 scope.go:117] "RemoveContainer" containerID="e88c0d7c7c40a11b03741c5d9be899580f2c590d3430272a5da08805416f43d7"
Dec 16 15:15:51 crc kubenswrapper[4775]: I1216 15:15:51.138829 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-nl6bm"
Dec 16 15:15:51 crc kubenswrapper[4775]: I1216 15:15:51.158618 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-9j6xx" podStartSLOduration=3.1780431 podStartE2EDuration="50.158600818s" podCreationTimestamp="2025-12-16 15:15:01 +0000 UTC" firstStartedPulling="2025-12-16 15:15:02.885168881 +0000 UTC m=+1227.836247804" lastFinishedPulling="2025-12-16 15:15:49.865726599 +0000 UTC m=+1274.816805522" observedRunningTime="2025-12-16 15:15:51.150960356 +0000 UTC m=+1276.102039279" watchObservedRunningTime="2025-12-16 15:15:51.158600818 +0000 UTC m=+1276.109679741"
Dec 16 15:15:51 crc kubenswrapper[4775]: I1216 15:15:51.208614 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nl6bm"]
Dec 16 15:15:51 crc kubenswrapper[4775]: I1216 15:15:51.212538 4775 scope.go:117] "RemoveContainer" containerID="a6b4ed58d6c92aa964be0edf745d5099f661a8b9a396a7f466a7b0d2ca0e6b42"
Dec 16 15:15:51 crc kubenswrapper[4775]: I1216 15:15:51.224048 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nl6bm"]
Dec 16 15:15:51 crc kubenswrapper[4775]: I1216 15:15:51.352697 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b4aaed-0663-4480-8baa-4311a7aa5278" path="/var/lib/kubelet/pods/21b4aaed-0663-4480-8baa-4311a7aa5278/volumes"
Dec 16 15:15:52 crc kubenswrapper[4775]: I1216 15:15:52.150155 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 16 15:15:52 crc kubenswrapper[4775]: I1216 15:15:52.150471 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 16 15:15:52 crc kubenswrapper[4775]: I1216 15:15:52.615657 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 16 15:15:52 crc kubenswrapper[4775]: I1216 15:15:52.793120 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 16 15:15:53 crc kubenswrapper[4775]: I1216 15:15:53.165197 4775 generic.go:334] "Generic (PLEG): container finished" podID="de018e03-657b-4eec-8b94-2d305f9bdbcf" containerID="d9a19fea1ed0cd5dce2d31249b71ac0e2c5481f696e72971d6b95d27b3d61533" exitCode=0
Dec 16 15:15:53 crc kubenswrapper[4775]: I1216 15:15:53.165264 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sw8rd" event={"ID":"de018e03-657b-4eec-8b94-2d305f9bdbcf","Type":"ContainerDied","Data":"d9a19fea1ed0cd5dce2d31249b71ac0e2c5481f696e72971d6b95d27b3d61533"}
Dec 16 15:15:55 crc kubenswrapper[4775]: I1216 15:15:55.190006 4775 generic.go:334] "Generic (PLEG): container finished" podID="23611da1-3f26-42c4-bd23-36e0b04bdc24" containerID="0bb60f91a6b144e70dc4dc03c61d897e0b0e04862929f94b4ec663398674d364" exitCode=0
Dec 16 15:15:55 crc kubenswrapper[4775]: I1216 15:15:55.190069 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-9j6xx" event={"ID":"23611da1-3f26-42c4-bd23-36e0b04bdc24","Type":"ContainerDied","Data":"0bb60f91a6b144e70dc4dc03c61d897e0b0e04862929f94b4ec663398674d364"}
Dec 16 15:15:56 crc kubenswrapper[4775]: I1216 15:15:56.203843 4775 generic.go:334] "Generic (PLEG): container finished" podID="6aae2a99-cf8f-4bdc-a5a0-4d548dcde207" containerID="553f825b26017179918aa15e17aad6de8ef28c4963820bef5b53cc7beed7a785" exitCode=0
Dec 16 15:15:56 crc kubenswrapper[4775]: I1216 15:15:56.204094 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j4mx8" event={"ID":"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207","Type":"ContainerDied","Data":"553f825b26017179918aa15e17aad6de8ef28c4963820bef5b53cc7beed7a785"}
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.645300 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-sw8rd"
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.650998 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-9j6xx"
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.748136 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23611da1-3f26-42c4-bd23-36e0b04bdc24-combined-ca-bundle\") pod \"23611da1-3f26-42c4-bd23-36e0b04bdc24\" (UID: \"23611da1-3f26-42c4-bd23-36e0b04bdc24\") "
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.748246 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcp7h\" (UniqueName: \"kubernetes.io/projected/23611da1-3f26-42c4-bd23-36e0b04bdc24-kube-api-access-mcp7h\") pod \"23611da1-3f26-42c4-bd23-36e0b04bdc24\" (UID: \"23611da1-3f26-42c4-bd23-36e0b04bdc24\") "
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.748335 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23611da1-3f26-42c4-bd23-36e0b04bdc24-config-data\") pod \"23611da1-3f26-42c4-bd23-36e0b04bdc24\" (UID: \"23611da1-3f26-42c4-bd23-36e0b04bdc24\") "
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.748366 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t67wj\" (UniqueName: \"kubernetes.io/projected/de018e03-657b-4eec-8b94-2d305f9bdbcf-kube-api-access-t67wj\") pod \"de018e03-657b-4eec-8b94-2d305f9bdbcf\" (UID: \"de018e03-657b-4eec-8b94-2d305f9bdbcf\") "
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.748408 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de018e03-657b-4eec-8b94-2d305f9bdbcf-combined-ca-bundle\") pod \"de018e03-657b-4eec-8b94-2d305f9bdbcf\" (UID: \"de018e03-657b-4eec-8b94-2d305f9bdbcf\") "
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.748508 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de018e03-657b-4eec-8b94-2d305f9bdbcf-db-sync-config-data\") pod \"de018e03-657b-4eec-8b94-2d305f9bdbcf\" (UID: \"de018e03-657b-4eec-8b94-2d305f9bdbcf\") "
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.753820 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de018e03-657b-4eec-8b94-2d305f9bdbcf-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "de018e03-657b-4eec-8b94-2d305f9bdbcf" (UID: "de018e03-657b-4eec-8b94-2d305f9bdbcf"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.755199 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de018e03-657b-4eec-8b94-2d305f9bdbcf-kube-api-access-t67wj" (OuterVolumeSpecName: "kube-api-access-t67wj") pod "de018e03-657b-4eec-8b94-2d305f9bdbcf" (UID: "de018e03-657b-4eec-8b94-2d305f9bdbcf"). InnerVolumeSpecName "kube-api-access-t67wj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.768686 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23611da1-3f26-42c4-bd23-36e0b04bdc24-kube-api-access-mcp7h" (OuterVolumeSpecName: "kube-api-access-mcp7h") pod "23611da1-3f26-42c4-bd23-36e0b04bdc24" (UID: "23611da1-3f26-42c4-bd23-36e0b04bdc24"). InnerVolumeSpecName "kube-api-access-mcp7h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.779193 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de018e03-657b-4eec-8b94-2d305f9bdbcf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de018e03-657b-4eec-8b94-2d305f9bdbcf" (UID: "de018e03-657b-4eec-8b94-2d305f9bdbcf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.785196 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23611da1-3f26-42c4-bd23-36e0b04bdc24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23611da1-3f26-42c4-bd23-36e0b04bdc24" (UID: "23611da1-3f26-42c4-bd23-36e0b04bdc24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.827976 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23611da1-3f26-42c4-bd23-36e0b04bdc24-config-data" (OuterVolumeSpecName: "config-data") pod "23611da1-3f26-42c4-bd23-36e0b04bdc24" (UID: "23611da1-3f26-42c4-bd23-36e0b04bdc24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.850851 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23611da1-3f26-42c4-bd23-36e0b04bdc24-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.850909 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcp7h\" (UniqueName: \"kubernetes.io/projected/23611da1-3f26-42c4-bd23-36e0b04bdc24-kube-api-access-mcp7h\") on node \"crc\" DevicePath \"\""
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.850923 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23611da1-3f26-42c4-bd23-36e0b04bdc24-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.850932 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t67wj\" (UniqueName: \"kubernetes.io/projected/de018e03-657b-4eec-8b94-2d305f9bdbcf-kube-api-access-t67wj\") on node \"crc\" DevicePath \"\""
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.850940 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de018e03-657b-4eec-8b94-2d305f9bdbcf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 15:15:57 crc kubenswrapper[4775]: I1216 15:15:57.850948 4775 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de018e03-657b-4eec-8b94-2d305f9bdbcf-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.231459 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-9j6xx" event={"ID":"23611da1-3f26-42c4-bd23-36e0b04bdc24","Type":"ContainerDied","Data":"91ad8970bd6e6ac9d0d860ef21f42bffca556f0e4a301894573601c9f96eeae1"}
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.231505 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91ad8970bd6e6ac9d0d860ef21f42bffca556f0e4a301894573601c9f96eeae1"
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.231820 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-9j6xx"
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.235233 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sw8rd" event={"ID":"de018e03-657b-4eec-8b94-2d305f9bdbcf","Type":"ContainerDied","Data":"14fa8132abb7910585df2d71bd2b1bb9ddb0f16440b810c9aa5b09f78ff373cc"}
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.235284 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14fa8132abb7910585df2d71bd2b1bb9ddb0f16440b810c9aa5b09f78ff373cc"
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.235300 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-sw8rd"
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.300415 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-j4mx8"
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.461844 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-combined-ca-bundle\") pod \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") "
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.461995 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-scripts\") pod \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") "
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.462056 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-config-data\") pod \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") "
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.462132 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-db-sync-config-data\") pod \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") "
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.462159 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-etc-machine-id\") pod \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") "
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.462219 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82dqt\" (UniqueName: \"kubernetes.io/projected/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-kube-api-access-82dqt\") pod \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\" (UID: \"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207\") "
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.462822 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6aae2a99-cf8f-4bdc-a5a0-4d548dcde207" (UID: "6aae2a99-cf8f-4bdc-a5a0-4d548dcde207"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.466658 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6aae2a99-cf8f-4bdc-a5a0-4d548dcde207" (UID: "6aae2a99-cf8f-4bdc-a5a0-4d548dcde207"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.467772 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-scripts" (OuterVolumeSpecName: "scripts") pod "6aae2a99-cf8f-4bdc-a5a0-4d548dcde207" (UID: "6aae2a99-cf8f-4bdc-a5a0-4d548dcde207"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.467869 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-kube-api-access-82dqt" (OuterVolumeSpecName: "kube-api-access-82dqt") pod "6aae2a99-cf8f-4bdc-a5a0-4d548dcde207" (UID: "6aae2a99-cf8f-4bdc-a5a0-4d548dcde207"). InnerVolumeSpecName "kube-api-access-82dqt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.502389 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6aae2a99-cf8f-4bdc-a5a0-4d548dcde207" (UID: "6aae2a99-cf8f-4bdc-a5a0-4d548dcde207"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.529557 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-config-data" (OuterVolumeSpecName: "config-data") pod "6aae2a99-cf8f-4bdc-a5a0-4d548dcde207" (UID: "6aae2a99-cf8f-4bdc-a5a0-4d548dcde207"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.564431 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82dqt\" (UniqueName: \"kubernetes.io/projected/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-kube-api-access-82dqt\") on node \"crc\" DevicePath \"\""
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.564460 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.564469 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.564489 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-config-data\") on node \"crc\" DevicePath \"\""
Dec
16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.564524 4775 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.564534 4775 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.930327 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6d485fb59c-wh26t"] Dec 16 15:15:58 crc kubenswrapper[4775]: E1216 15:15:58.931465 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de018e03-657b-4eec-8b94-2d305f9bdbcf" containerName="barbican-db-sync" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.931483 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="de018e03-657b-4eec-8b94-2d305f9bdbcf" containerName="barbican-db-sync" Dec 16 15:15:58 crc kubenswrapper[4775]: E1216 15:15:58.931502 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b4aaed-0663-4480-8baa-4311a7aa5278" containerName="dnsmasq-dns" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.931510 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b4aaed-0663-4480-8baa-4311a7aa5278" containerName="dnsmasq-dns" Dec 16 15:15:58 crc kubenswrapper[4775]: E1216 15:15:58.931520 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23611da1-3f26-42c4-bd23-36e0b04bdc24" containerName="heat-db-sync" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.931527 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="23611da1-3f26-42c4-bd23-36e0b04bdc24" containerName="heat-db-sync" Dec 16 15:15:58 crc kubenswrapper[4775]: E1216 15:15:58.931540 4775 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6aae2a99-cf8f-4bdc-a5a0-4d548dcde207" containerName="cinder-db-sync" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.931545 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aae2a99-cf8f-4bdc-a5a0-4d548dcde207" containerName="cinder-db-sync" Dec 16 15:15:58 crc kubenswrapper[4775]: E1216 15:15:58.931576 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b4aaed-0663-4480-8baa-4311a7aa5278" containerName="init" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.931582 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b4aaed-0663-4480-8baa-4311a7aa5278" containerName="init" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.931781 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b4aaed-0663-4480-8baa-4311a7aa5278" containerName="dnsmasq-dns" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.931815 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="23611da1-3f26-42c4-bd23-36e0b04bdc24" containerName="heat-db-sync" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.931824 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="de018e03-657b-4eec-8b94-2d305f9bdbcf" containerName="barbican-db-sync" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.931839 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aae2a99-cf8f-4bdc-a5a0-4d548dcde207" containerName="cinder-db-sync" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.932975 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6d485fb59c-wh26t" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.936007 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.936200 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-g5kw7" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.936247 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.956341 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6c8466bf58-6vkrk"] Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.977125 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.981447 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 16 15:15:58 crc kubenswrapper[4775]: I1216 15:15:58.989700 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6c8466bf58-6vkrk"] Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.016979 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d485fb59c-wh26t"] Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.063636 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dzlmf"] Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.065479 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.074569 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dzlmf"] Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.074572 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1035b49f-bf1a-44ee-9cd4-01df93145086-config-data\") pod \"barbican-worker-6d485fb59c-wh26t\" (UID: \"1035b49f-bf1a-44ee-9cd4-01df93145086\") " pod="openstack/barbican-worker-6d485fb59c-wh26t" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.074935 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1035b49f-bf1a-44ee-9cd4-01df93145086-config-data-custom\") pod \"barbican-worker-6d485fb59c-wh26t\" (UID: \"1035b49f-bf1a-44ee-9cd4-01df93145086\") " pod="openstack/barbican-worker-6d485fb59c-wh26t" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.075038 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb2xp\" (UniqueName: \"kubernetes.io/projected/1035b49f-bf1a-44ee-9cd4-01df93145086-kube-api-access-rb2xp\") pod \"barbican-worker-6d485fb59c-wh26t\" (UID: \"1035b49f-bf1a-44ee-9cd4-01df93145086\") " pod="openstack/barbican-worker-6d485fb59c-wh26t" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.075113 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g79nl\" (UniqueName: \"kubernetes.io/projected/ee4d6d93-229d-499b-8121-123db79d7758-kube-api-access-g79nl\") pod \"barbican-keystone-listener-6c8466bf58-6vkrk\" (UID: \"ee4d6d93-229d-499b-8121-123db79d7758\") " pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 
15:15:59.075232 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee4d6d93-229d-499b-8121-123db79d7758-logs\") pod \"barbican-keystone-listener-6c8466bf58-6vkrk\" (UID: \"ee4d6d93-229d-499b-8121-123db79d7758\") " pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.075325 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee4d6d93-229d-499b-8121-123db79d7758-config-data-custom\") pod \"barbican-keystone-listener-6c8466bf58-6vkrk\" (UID: \"ee4d6d93-229d-499b-8121-123db79d7758\") " pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.075392 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4d6d93-229d-499b-8121-123db79d7758-combined-ca-bundle\") pod \"barbican-keystone-listener-6c8466bf58-6vkrk\" (UID: \"ee4d6d93-229d-499b-8121-123db79d7758\") " pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.075442 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1035b49f-bf1a-44ee-9cd4-01df93145086-combined-ca-bundle\") pod \"barbican-worker-6d485fb59c-wh26t\" (UID: \"1035b49f-bf1a-44ee-9cd4-01df93145086\") " pod="openstack/barbican-worker-6d485fb59c-wh26t" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.075561 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee4d6d93-229d-499b-8121-123db79d7758-config-data\") pod \"barbican-keystone-listener-6c8466bf58-6vkrk\" 
(UID: \"ee4d6d93-229d-499b-8121-123db79d7758\") " pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.075592 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1035b49f-bf1a-44ee-9cd4-01df93145086-logs\") pod \"barbican-worker-6d485fb59c-wh26t\" (UID: \"1035b49f-bf1a-44ee-9cd4-01df93145086\") " pod="openstack/barbican-worker-6d485fb59c-wh26t" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.177080 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4d6d93-229d-499b-8121-123db79d7758-combined-ca-bundle\") pod \"barbican-keystone-listener-6c8466bf58-6vkrk\" (UID: \"ee4d6d93-229d-499b-8121-123db79d7758\") " pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.177155 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.177228 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1035b49f-bf1a-44ee-9cd4-01df93145086-combined-ca-bundle\") pod \"barbican-worker-6d485fb59c-wh26t\" (UID: \"1035b49f-bf1a-44ee-9cd4-01df93145086\") " pod="openstack/barbican-worker-6d485fb59c-wh26t" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.177253 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.177300 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnvv9\" (UniqueName: \"kubernetes.io/projected/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-kube-api-access-fnvv9\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.177920 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee4d6d93-229d-499b-8121-123db79d7758-config-data\") pod \"barbican-keystone-listener-6c8466bf58-6vkrk\" (UID: \"ee4d6d93-229d-499b-8121-123db79d7758\") " pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.177993 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1035b49f-bf1a-44ee-9cd4-01df93145086-logs\") pod \"barbican-worker-6d485fb59c-wh26t\" (UID: \"1035b49f-bf1a-44ee-9cd4-01df93145086\") " pod="openstack/barbican-worker-6d485fb59c-wh26t" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.178095 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-config\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.178168 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1035b49f-bf1a-44ee-9cd4-01df93145086-config-data\") pod \"barbican-worker-6d485fb59c-wh26t\" (UID: \"1035b49f-bf1a-44ee-9cd4-01df93145086\") " pod="openstack/barbican-worker-6d485fb59c-wh26t" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.178305 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1035b49f-bf1a-44ee-9cd4-01df93145086-config-data-custom\") pod \"barbican-worker-6d485fb59c-wh26t\" (UID: \"1035b49f-bf1a-44ee-9cd4-01df93145086\") " pod="openstack/barbican-worker-6d485fb59c-wh26t" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.178413 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb2xp\" (UniqueName: \"kubernetes.io/projected/1035b49f-bf1a-44ee-9cd4-01df93145086-kube-api-access-rb2xp\") pod \"barbican-worker-6d485fb59c-wh26t\" (UID: \"1035b49f-bf1a-44ee-9cd4-01df93145086\") " pod="openstack/barbican-worker-6d485fb59c-wh26t" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.178448 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.178523 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g79nl\" (UniqueName: \"kubernetes.io/projected/ee4d6d93-229d-499b-8121-123db79d7758-kube-api-access-g79nl\") pod \"barbican-keystone-listener-6c8466bf58-6vkrk\" (UID: \"ee4d6d93-229d-499b-8121-123db79d7758\") " pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.178586 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee4d6d93-229d-499b-8121-123db79d7758-logs\") pod \"barbican-keystone-listener-6c8466bf58-6vkrk\" (UID: \"ee4d6d93-229d-499b-8121-123db79d7758\") " pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.178679 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-dns-svc\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.178710 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee4d6d93-229d-499b-8121-123db79d7758-config-data-custom\") pod \"barbican-keystone-listener-6c8466bf58-6vkrk\" (UID: \"ee4d6d93-229d-499b-8121-123db79d7758\") " pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.180044 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1035b49f-bf1a-44ee-9cd4-01df93145086-logs\") pod \"barbican-worker-6d485fb59c-wh26t\" (UID: \"1035b49f-bf1a-44ee-9cd4-01df93145086\") " pod="openstack/barbican-worker-6d485fb59c-wh26t" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.181436 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee4d6d93-229d-499b-8121-123db79d7758-logs\") pod \"barbican-keystone-listener-6c8466bf58-6vkrk\" (UID: \"ee4d6d93-229d-499b-8121-123db79d7758\") " pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.185863 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee4d6d93-229d-499b-8121-123db79d7758-config-data-custom\") pod \"barbican-keystone-listener-6c8466bf58-6vkrk\" (UID: \"ee4d6d93-229d-499b-8121-123db79d7758\") " pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.189325 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee4d6d93-229d-499b-8121-123db79d7758-config-data\") pod \"barbican-keystone-listener-6c8466bf58-6vkrk\" (UID: \"ee4d6d93-229d-499b-8121-123db79d7758\") " pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.192278 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4d6d93-229d-499b-8121-123db79d7758-combined-ca-bundle\") pod \"barbican-keystone-listener-6c8466bf58-6vkrk\" (UID: \"ee4d6d93-229d-499b-8121-123db79d7758\") " pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.193543 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1035b49f-bf1a-44ee-9cd4-01df93145086-combined-ca-bundle\") pod \"barbican-worker-6d485fb59c-wh26t\" (UID: \"1035b49f-bf1a-44ee-9cd4-01df93145086\") " pod="openstack/barbican-worker-6d485fb59c-wh26t" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.194029 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1035b49f-bf1a-44ee-9cd4-01df93145086-config-data-custom\") pod \"barbican-worker-6d485fb59c-wh26t\" (UID: \"1035b49f-bf1a-44ee-9cd4-01df93145086\") " pod="openstack/barbican-worker-6d485fb59c-wh26t" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.212389 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1035b49f-bf1a-44ee-9cd4-01df93145086-config-data\") pod \"barbican-worker-6d485fb59c-wh26t\" (UID: \"1035b49f-bf1a-44ee-9cd4-01df93145086\") " pod="openstack/barbican-worker-6d485fb59c-wh26t" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.245761 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g79nl\" (UniqueName: \"kubernetes.io/projected/ee4d6d93-229d-499b-8121-123db79d7758-kube-api-access-g79nl\") pod \"barbican-keystone-listener-6c8466bf58-6vkrk\" (UID: \"ee4d6d93-229d-499b-8121-123db79d7758\") " pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.245958 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb2xp\" (UniqueName: \"kubernetes.io/projected/1035b49f-bf1a-44ee-9cd4-01df93145086-kube-api-access-rb2xp\") pod \"barbican-worker-6d485fb59c-wh26t\" (UID: \"1035b49f-bf1a-44ee-9cd4-01df93145086\") " pod="openstack/barbican-worker-6d485fb59c-wh26t" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.273435 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6d485fb59c-wh26t" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.282657 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-dns-svc\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.282738 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.282771 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.282809 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnvv9\" (UniqueName: \"kubernetes.io/projected/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-kube-api-access-fnvv9\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.282865 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-config\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " 
pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.282990 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.283984 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-dns-svc\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.284092 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.284931 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-config\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.285218 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 
15:15:59.287114 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.307006 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.321227 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b","Type":"ContainerStarted","Data":"3c6a86b5ed43347643f221f5902e20865200132a93844894e65cda6fec1a5860"} Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.321497 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerName="ceilometer-central-agent" containerID="cri-o://9e73013142ca6f6ab9e7af148b0250deeda8b2986cc2a0da4fc90089684cbb7f" gracePeriod=30 Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.321581 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.321988 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerName="proxy-httpd" containerID="cri-o://3c6a86b5ed43347643f221f5902e20865200132a93844894e65cda6fec1a5860" gracePeriod=30 Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.322040 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerName="sg-core" 
containerID="cri-o://ad1192d4f9bfce3336e0934046cf26aab5fdc6fa9de19030bc3651ed4b32b8e9" gracePeriod=30 Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.322079 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerName="ceilometer-notification-agent" containerID="cri-o://8ccd8ca819b4e266a04faad0a0625b9006c3cb7c757ece3b5742cd732f812992" gracePeriod=30 Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.329319 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-779fdf9558-tvqbl"] Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.329724 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnvv9\" (UniqueName: \"kubernetes.io/projected/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-kube-api-access-fnvv9\") pod \"dnsmasq-dns-85ff748b95-dzlmf\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.347184 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.357525 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-j4mx8" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.369546 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.385395 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.386558 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.241453865 podStartE2EDuration="58.386541405s" podCreationTimestamp="2025-12-16 15:15:01 +0000 UTC" firstStartedPulling="2025-12-16 15:15:03.407700351 +0000 UTC m=+1228.358779274" lastFinishedPulling="2025-12-16 15:15:58.552787891 +0000 UTC m=+1283.503866814" observedRunningTime="2025-12-16 15:15:59.385531723 +0000 UTC m=+1284.336610646" watchObservedRunningTime="2025-12-16 15:15:59.386541405 +0000 UTC m=+1284.337620328" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.408728 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j4mx8" event={"ID":"6aae2a99-cf8f-4bdc-a5a0-4d548dcde207","Type":"ContainerDied","Data":"efeebc81d4ebaa959f273509057f99e8dc0db1be698af1c35836702c098eacb5"} Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.408773 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efeebc81d4ebaa959f273509057f99e8dc0db1be698af1c35836702c098eacb5" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.408787 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-779fdf9558-tvqbl"] Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.492539 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a040578-d21e-448d-b653-15547567f335-logs\") pod \"barbican-api-779fdf9558-tvqbl\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.492662 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-config-data-custom\") pod \"barbican-api-779fdf9558-tvqbl\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.492697 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-combined-ca-bundle\") pod \"barbican-api-779fdf9558-tvqbl\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.492734 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-config-data\") pod \"barbican-api-779fdf9558-tvqbl\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.492790 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mkh6\" (UniqueName: \"kubernetes.io/projected/1a040578-d21e-448d-b653-15547567f335-kube-api-access-2mkh6\") pod \"barbican-api-779fdf9558-tvqbl\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.594110 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-config-data-custom\") pod \"barbican-api-779fdf9558-tvqbl\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.594524 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-combined-ca-bundle\") pod \"barbican-api-779fdf9558-tvqbl\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.594551 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-config-data\") pod \"barbican-api-779fdf9558-tvqbl\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.594578 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mkh6\" (UniqueName: \"kubernetes.io/projected/1a040578-d21e-448d-b653-15547567f335-kube-api-access-2mkh6\") pod \"barbican-api-779fdf9558-tvqbl\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.594685 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a040578-d21e-448d-b653-15547567f335-logs\") pod \"barbican-api-779fdf9558-tvqbl\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.595298 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a040578-d21e-448d-b653-15547567f335-logs\") pod \"barbican-api-779fdf9558-tvqbl\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.605160 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-config-data-custom\") pod \"barbican-api-779fdf9558-tvqbl\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.611099 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-config-data\") pod \"barbican-api-779fdf9558-tvqbl\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.613776 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-combined-ca-bundle\") pod \"barbican-api-779fdf9558-tvqbl\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.631098 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mkh6\" (UniqueName: \"kubernetes.io/projected/1a040578-d21e-448d-b653-15547567f335-kube-api-access-2mkh6\") pod \"barbican-api-779fdf9558-tvqbl\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.719374 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.726066 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.729356 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.730181 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.730268 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.730375 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.730614 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.730715 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9nnnp" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.808348 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.808456 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e34c59d0-d2cd-41cd-990a-bed8a44d1230-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.808532 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-scripts\") pod \"cinder-scheduler-0\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.808555 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjxct\" (UniqueName: \"kubernetes.io/projected/e34c59d0-d2cd-41cd-990a-bed8a44d1230-kube-api-access-bjxct\") pod \"cinder-scheduler-0\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.808586 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-config-data\") pod \"cinder-scheduler-0\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.814378 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.921443 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-scripts\") pod \"cinder-scheduler-0\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.921510 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjxct\" (UniqueName: 
\"kubernetes.io/projected/e34c59d0-d2cd-41cd-990a-bed8a44d1230-kube-api-access-bjxct\") pod \"cinder-scheduler-0\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.921554 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-config-data\") pod \"cinder-scheduler-0\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.921624 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.921714 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.921758 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e34c59d0-d2cd-41cd-990a-bed8a44d1230-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.938791 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e34c59d0-d2cd-41cd-990a-bed8a44d1230-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.947666 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-scripts\") pod \"cinder-scheduler-0\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.966038 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.969988 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjxct\" (UniqueName: \"kubernetes.io/projected/e34c59d0-d2cd-41cd-990a-bed8a44d1230-kube-api-access-bjxct\") pod \"cinder-scheduler-0\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.976283 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.980534 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dzlmf"] Dec 16 15:15:59 crc kubenswrapper[4775]: I1216 15:15:59.980993 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.043948 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-2xd25"] Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.045549 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.053020 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-2xd25"] Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.064006 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d485fb59c-wh26t"] Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.065849 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.072441 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.076296 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.080220 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.087907 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.127952 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.128038 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.128097 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.128129 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-config\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc 
kubenswrapper[4775]: I1216 15:16:00.128150 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.128165 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mglgp\" (UniqueName: \"kubernetes.io/projected/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-kube-api-access-mglgp\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.229643 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dzlmf"] Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.231112 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a5c773c-3b58-405f-a31f-b4b872509e1e-logs\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.231201 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.231311 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-combined-ca-bundle\") pod 
\"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.231383 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.231442 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.231578 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.231611 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-config-data\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.231656 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a5c773c-3b58-405f-a31f-b4b872509e1e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" 
Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.231734 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-config\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.231773 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.231820 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mglgp\" (UniqueName: \"kubernetes.io/projected/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-kube-api-access-mglgp\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.231852 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxkpk\" (UniqueName: \"kubernetes.io/projected/8a5c773c-3b58-405f-a31f-b4b872509e1e-kube-api-access-cxkpk\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.231904 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-scripts\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.233557 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.233626 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-config\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.234110 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.234843 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.235501 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: W1216 15:16:00.241200 4775 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1d6d8e3_75bb_4cbc_92e7_9a245f490030.slice/crio-a0a6166dcb51c9d00fba00a05ec9439bf1fc0cea7838e5cdb8ffd3cda39c9937 WatchSource:0}: Error finding container a0a6166dcb51c9d00fba00a05ec9439bf1fc0cea7838e5cdb8ffd3cda39c9937: Status 404 returned error can't find the container with id a0a6166dcb51c9d00fba00a05ec9439bf1fc0cea7838e5cdb8ffd3cda39c9937 Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.245274 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6c8466bf58-6vkrk"] Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.264760 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mglgp\" (UniqueName: \"kubernetes.io/projected/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-kube-api-access-mglgp\") pod \"dnsmasq-dns-5c9776ccc5-2xd25\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.334016 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxkpk\" (UniqueName: \"kubernetes.io/projected/8a5c773c-3b58-405f-a31f-b4b872509e1e-kube-api-access-cxkpk\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.334362 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-scripts\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.334394 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a5c773c-3b58-405f-a31f-b4b872509e1e-logs\") pod \"cinder-api-0\" (UID: 
\"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.334466 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.334498 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.334577 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-config-data\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.334598 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a5c773c-3b58-405f-a31f-b4b872509e1e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.334684 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a5c773c-3b58-405f-a31f-b4b872509e1e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.336572 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/8a5c773c-3b58-405f-a31f-b4b872509e1e-logs\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.338359 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-scripts\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.339442 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.340530 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.344134 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-config-data\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.353170 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxkpk\" (UniqueName: \"kubernetes.io/projected/8a5c773c-3b58-405f-a31f-b4b872509e1e-kube-api-access-cxkpk\") pod \"cinder-api-0\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.393549 4775 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.396800 4775 generic.go:334] "Generic (PLEG): container finished" podID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerID="3c6a86b5ed43347643f221f5902e20865200132a93844894e65cda6fec1a5860" exitCode=0 Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.396847 4775 generic.go:334] "Generic (PLEG): container finished" podID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerID="ad1192d4f9bfce3336e0934046cf26aab5fdc6fa9de19030bc3651ed4b32b8e9" exitCode=2 Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.396859 4775 generic.go:334] "Generic (PLEG): container finished" podID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerID="9e73013142ca6f6ab9e7af148b0250deeda8b2986cc2a0da4fc90089684cbb7f" exitCode=0 Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.396954 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b","Type":"ContainerDied","Data":"3c6a86b5ed43347643f221f5902e20865200132a93844894e65cda6fec1a5860"} Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.396992 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b","Type":"ContainerDied","Data":"ad1192d4f9bfce3336e0934046cf26aab5fdc6fa9de19030bc3651ed4b32b8e9"} Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.397008 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b","Type":"ContainerDied","Data":"9e73013142ca6f6ab9e7af148b0250deeda8b2986cc2a0da4fc90089684cbb7f"} Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.404351 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.408550 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" event={"ID":"d1d6d8e3-75bb-4cbc-92e7-9a245f490030","Type":"ContainerStarted","Data":"a0a6166dcb51c9d00fba00a05ec9439bf1fc0cea7838e5cdb8ffd3cda39c9937"} Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.413244 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d485fb59c-wh26t" event={"ID":"1035b49f-bf1a-44ee-9cd4-01df93145086","Type":"ContainerStarted","Data":"05b2519970aaeace6a16d3e9d1d9788f20562a2e5c25ae3e09bc64d1f1152ef3"} Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.414691 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" event={"ID":"ee4d6d93-229d-499b-8121-123db79d7758","Type":"ContainerStarted","Data":"e038b89fee900f1ebf428d9f5866c8c8fa39bf45347b39c2b7aefd5d98e52200"} Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.522417 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-779fdf9558-tvqbl"] Dec 16 15:16:00 crc kubenswrapper[4775]: W1216 15:16:00.595840 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a040578_d21e_448d_b653_15547567f335.slice/crio-1d42bdc28fb0a15a30167e836b7ce0e6be45f53b1afcb0919543d4b653e2bfa7 WatchSource:0}: Error finding container 1d42bdc28fb0a15a30167e836b7ce0e6be45f53b1afcb0919543d4b653e2bfa7: Status 404 returned error can't find the container with id 1d42bdc28fb0a15a30167e836b7ce0e6be45f53b1afcb0919543d4b653e2bfa7 Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.678967 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.823283 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-5c9776ccc5-2xd25"] Dec 16 15:16:00 crc kubenswrapper[4775]: W1216 15:16:00.823401 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae6daf4a_5550_44e9_a0bd_11bc6527ad5d.slice/crio-96a7479c014ec9840606b8bf2adf34b2116c96c95f7e85bce69f5a4d7639f887 WatchSource:0}: Error finding container 96a7479c014ec9840606b8bf2adf34b2116c96c95f7e85bce69f5a4d7639f887: Status 404 returned error can't find the container with id 96a7479c014ec9840606b8bf2adf34b2116c96c95f7e85bce69f5a4d7639f887 Dec 16 15:16:00 crc kubenswrapper[4775]: I1216 15:16:00.891175 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 15:16:00 crc kubenswrapper[4775]: W1216 15:16:00.955084 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a5c773c_3b58_405f_a31f_b4b872509e1e.slice/crio-10100b52842afe1614a745ae92d50c91a6f770ff8840991ae509167e987b18d1 WatchSource:0}: Error finding container 10100b52842afe1614a745ae92d50c91a6f770ff8840991ae509167e987b18d1: Status 404 returned error can't find the container with id 10100b52842afe1614a745ae92d50c91a6f770ff8840991ae509167e987b18d1 Dec 16 15:16:01 crc kubenswrapper[4775]: I1216 15:16:01.427159 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-779fdf9558-tvqbl" event={"ID":"1a040578-d21e-448d-b653-15547567f335","Type":"ContainerStarted","Data":"25e112847f41c5698171e2bcc24b3845e7f4a7b459592f7a07e46c23b81ad450"} Dec 16 15:16:01 crc kubenswrapper[4775]: I1216 15:16:01.427562 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:16:01 crc kubenswrapper[4775]: I1216 15:16:01.427575 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-779fdf9558-tvqbl" 
event={"ID":"1a040578-d21e-448d-b653-15547567f335","Type":"ContainerStarted","Data":"d7c8fcbc22d506c3d006fd714b4a2a0e7730446f5aecd05518e3ca018538c339"} Dec 16 15:16:01 crc kubenswrapper[4775]: I1216 15:16:01.427586 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-779fdf9558-tvqbl" event={"ID":"1a040578-d21e-448d-b653-15547567f335","Type":"ContainerStarted","Data":"1d42bdc28fb0a15a30167e836b7ce0e6be45f53b1afcb0919543d4b653e2bfa7"} Dec 16 15:16:01 crc kubenswrapper[4775]: I1216 15:16:01.429777 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a5c773c-3b58-405f-a31f-b4b872509e1e","Type":"ContainerStarted","Data":"10100b52842afe1614a745ae92d50c91a6f770ff8840991ae509167e987b18d1"} Dec 16 15:16:01 crc kubenswrapper[4775]: I1216 15:16:01.431619 4775 generic.go:334] "Generic (PLEG): container finished" podID="ae6daf4a-5550-44e9-a0bd-11bc6527ad5d" containerID="83b1d8fcd134c9654aa43e1ac362b72dd4ae4d8f06f5703b7cc23d95b9c668a9" exitCode=0 Dec 16 15:16:01 crc kubenswrapper[4775]: I1216 15:16:01.431663 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" event={"ID":"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d","Type":"ContainerDied","Data":"83b1d8fcd134c9654aa43e1ac362b72dd4ae4d8f06f5703b7cc23d95b9c668a9"} Dec 16 15:16:01 crc kubenswrapper[4775]: I1216 15:16:01.431713 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" event={"ID":"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d","Type":"ContainerStarted","Data":"96a7479c014ec9840606b8bf2adf34b2116c96c95f7e85bce69f5a4d7639f887"} Dec 16 15:16:01 crc kubenswrapper[4775]: I1216 15:16:01.433483 4775 generic.go:334] "Generic (PLEG): container finished" podID="d1d6d8e3-75bb-4cbc-92e7-9a245f490030" containerID="1ee87870db72b739c5535854007c018c610c5dfb3b09e8f7dc5e9c17e946c3fd" exitCode=0 Dec 16 15:16:01 crc kubenswrapper[4775]: I1216 15:16:01.433518 4775 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" event={"ID":"d1d6d8e3-75bb-4cbc-92e7-9a245f490030","Type":"ContainerDied","Data":"1ee87870db72b739c5535854007c018c610c5dfb3b09e8f7dc5e9c17e946c3fd"} Dec 16 15:16:01 crc kubenswrapper[4775]: I1216 15:16:01.435905 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e34c59d0-d2cd-41cd-990a-bed8a44d1230","Type":"ContainerStarted","Data":"2bb4272982f64bfbeef5c9e9895ae32850810f1c7df7ba3c935ca07209a31486"} Dec 16 15:16:01 crc kubenswrapper[4775]: I1216 15:16:01.483304 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-779fdf9558-tvqbl" podStartSLOduration=2.483277456 podStartE2EDuration="2.483277456s" podCreationTimestamp="2025-12-16 15:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:01.451567695 +0000 UTC m=+1286.402646628" watchObservedRunningTime="2025-12-16 15:16:01.483277456 +0000 UTC m=+1286.434356379" Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.128185 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.221131 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-dns-swift-storage-0\") pod \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.221213 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-config\") pod \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.221237 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnvv9\" (UniqueName: \"kubernetes.io/projected/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-kube-api-access-fnvv9\") pod \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.221297 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-ovsdbserver-sb\") pod \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.221352 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-ovsdbserver-nb\") pod \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.221493 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-dns-svc\") pod \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\" (UID: \"d1d6d8e3-75bb-4cbc-92e7-9a245f490030\") " Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.228152 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-kube-api-access-fnvv9" (OuterVolumeSpecName: "kube-api-access-fnvv9") pod "d1d6d8e3-75bb-4cbc-92e7-9a245f490030" (UID: "d1d6d8e3-75bb-4cbc-92e7-9a245f490030"). InnerVolumeSpecName "kube-api-access-fnvv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.249557 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d1d6d8e3-75bb-4cbc-92e7-9a245f490030" (UID: "d1d6d8e3-75bb-4cbc-92e7-9a245f490030"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.255515 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d1d6d8e3-75bb-4cbc-92e7-9a245f490030" (UID: "d1d6d8e3-75bb-4cbc-92e7-9a245f490030"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.261181 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-config" (OuterVolumeSpecName: "config") pod "d1d6d8e3-75bb-4cbc-92e7-9a245f490030" (UID: "d1d6d8e3-75bb-4cbc-92e7-9a245f490030"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.269206 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d1d6d8e3-75bb-4cbc-92e7-9a245f490030" (UID: "d1d6d8e3-75bb-4cbc-92e7-9a245f490030"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.281445 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d1d6d8e3-75bb-4cbc-92e7-9a245f490030" (UID: "d1d6d8e3-75bb-4cbc-92e7-9a245f490030"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.324021 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.324257 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.324330 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.324399 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnvv9\" (UniqueName: \"kubernetes.io/projected/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-kube-api-access-fnvv9\") on node \"crc\" DevicePath \"\"" 
Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.324467 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.324541 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1d6d8e3-75bb-4cbc-92e7-9a245f490030-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.462934 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a5c773c-3b58-405f-a31f-b4b872509e1e","Type":"ContainerStarted","Data":"4634da530ff6bbd47268c50143df7998ac4625ef3df4f95e643d20e3724931a1"} Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.466027 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" event={"ID":"d1d6d8e3-75bb-4cbc-92e7-9a245f490030","Type":"ContainerDied","Data":"a0a6166dcb51c9d00fba00a05ec9439bf1fc0cea7838e5cdb8ffd3cda39c9937"} Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.466098 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.466094 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dzlmf" Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.466118 4775 scope.go:117] "RemoveContainer" containerID="1ee87870db72b739c5535854007c018c610c5dfb3b09e8f7dc5e9c17e946c3fd" Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.537457 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dzlmf"] Dec 16 15:16:02 crc kubenswrapper[4775]: I1216 15:16:02.545139 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dzlmf"] Dec 16 15:16:03 crc kubenswrapper[4775]: I1216 15:16:03.169274 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 15:16:03 crc kubenswrapper[4775]: I1216 15:16:03.350255 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1d6d8e3-75bb-4cbc-92e7-9a245f490030" path="/var/lib/kubelet/pods/d1d6d8e3-75bb-4cbc-92e7-9a245f490030/volumes" Dec 16 15:16:03 crc kubenswrapper[4775]: I1216 15:16:03.484595 4775 generic.go:334] "Generic (PLEG): container finished" podID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerID="8ccd8ca819b4e266a04faad0a0625b9006c3cb7c757ece3b5742cd732f812992" exitCode=0 Dec 16 15:16:03 crc kubenswrapper[4775]: I1216 15:16:03.484626 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b","Type":"ContainerDied","Data":"8ccd8ca819b4e266a04faad0a0625b9006c3cb7c757ece3b5742cd732f812992"} Dec 16 15:16:03 crc kubenswrapper[4775]: I1216 15:16:03.486372 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" event={"ID":"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d","Type":"ContainerStarted","Data":"93d9517139b3988374248ab712d91f2f749fed150a9bd8239d3c3e9608d0978a"} Dec 16 15:16:03 crc kubenswrapper[4775]: I1216 15:16:03.486506 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:03 crc kubenswrapper[4775]: I1216 15:16:03.487790 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a5c773c-3b58-405f-a31f-b4b872509e1e","Type":"ContainerStarted","Data":"0ca9e71065dce630b808e7bc5b992ab95e7313eea2afb89b56d8fd5f2b3bee47"} Dec 16 15:16:03 crc kubenswrapper[4775]: I1216 15:16:03.487937 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8a5c773c-3b58-405f-a31f-b4b872509e1e" containerName="cinder-api-log" containerID="cri-o://4634da530ff6bbd47268c50143df7998ac4625ef3df4f95e643d20e3724931a1" gracePeriod=30 Dec 16 15:16:03 crc kubenswrapper[4775]: I1216 15:16:03.488203 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 16 15:16:03 crc kubenswrapper[4775]: I1216 15:16:03.488250 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8a5c773c-3b58-405f-a31f-b4b872509e1e" containerName="cinder-api" containerID="cri-o://0ca9e71065dce630b808e7bc5b992ab95e7313eea2afb89b56d8fd5f2b3bee47" gracePeriod=30 Dec 16 15:16:03 crc kubenswrapper[4775]: I1216 15:16:03.494429 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d485fb59c-wh26t" event={"ID":"1035b49f-bf1a-44ee-9cd4-01df93145086","Type":"ContainerStarted","Data":"900ac7af22df73c9fdd662102871fa476399336eac2baed8ba52fcdc1cf6ae13"} Dec 16 15:16:03 crc kubenswrapper[4775]: I1216 15:16:03.496476 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" event={"ID":"ee4d6d93-229d-499b-8121-123db79d7758","Type":"ContainerStarted","Data":"09005c4bd339fbd5cf88e11988173815b1396df3b7c56ae0898b4fd062be7653"} Dec 16 15:16:03 crc kubenswrapper[4775]: I1216 15:16:03.496529 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" event={"ID":"ee4d6d93-229d-499b-8121-123db79d7758","Type":"ContainerStarted","Data":"096d6512a0a6744bc30e3cf5742efeaa4d022286ad0230a99a1bfdcfc540722e"} Dec 16 15:16:03 crc kubenswrapper[4775]: I1216 15:16:03.508217 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" podStartSLOduration=4.508192212 podStartE2EDuration="4.508192212s" podCreationTimestamp="2025-12-16 15:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:03.505311801 +0000 UTC m=+1288.456390744" watchObservedRunningTime="2025-12-16 15:16:03.508192212 +0000 UTC m=+1288.459271135" Dec 16 15:16:03 crc kubenswrapper[4775]: I1216 15:16:03.539475 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6c8466bf58-6vkrk" podStartSLOduration=3.123600409 podStartE2EDuration="5.539449277s" podCreationTimestamp="2025-12-16 15:15:58 +0000 UTC" firstStartedPulling="2025-12-16 15:16:00.252515486 +0000 UTC m=+1285.203594409" lastFinishedPulling="2025-12-16 15:16:02.668364354 +0000 UTC m=+1287.619443277" observedRunningTime="2025-12-16 15:16:03.529128781 +0000 UTC m=+1288.480207714" watchObservedRunningTime="2025-12-16 15:16:03.539449277 +0000 UTC m=+1288.490528200" Dec 16 15:16:03 crc kubenswrapper[4775]: I1216 15:16:03.570836 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.570814727 podStartE2EDuration="4.570814727s" podCreationTimestamp="2025-12-16 15:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:03.548395709 +0000 UTC m=+1288.499474652" watchObservedRunningTime="2025-12-16 15:16:03.570814727 +0000 UTC m=+1288.521893650" Dec 16 15:16:04 crc 
kubenswrapper[4775]: I1216 15:16:04.006952 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.170932 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-scripts\") pod \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.171182 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-log-httpd\") pod \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.171212 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-combined-ca-bundle\") pod \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.171268 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vlnk\" (UniqueName: \"kubernetes.io/projected/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-kube-api-access-4vlnk\") pod \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.171304 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-config-data\") pod \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.171335 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-sg-core-conf-yaml\") pod \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.171378 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-run-httpd\") pod \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\" (UID: \"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b\") " Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.172529 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" (UID: "6b7de359-0a4d-4b1c-9ab1-1cad66c5877b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.175332 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" (UID: "6b7de359-0a4d-4b1c-9ab1-1cad66c5877b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.177971 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-kube-api-access-4vlnk" (OuterVolumeSpecName: "kube-api-access-4vlnk") pod "6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" (UID: "6b7de359-0a4d-4b1c-9ab1-1cad66c5877b"). InnerVolumeSpecName "kube-api-access-4vlnk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.179105 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-scripts" (OuterVolumeSpecName: "scripts") pod "6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" (UID: "6b7de359-0a4d-4b1c-9ab1-1cad66c5877b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.204703 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" (UID: "6b7de359-0a4d-4b1c-9ab1-1cad66c5877b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.273077 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.273107 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vlnk\" (UniqueName: \"kubernetes.io/projected/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-kube-api-access-4vlnk\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.273120 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.273131 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 
15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.273139 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.302060 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-config-data" (OuterVolumeSpecName: "config-data") pod "6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" (UID: "6b7de359-0a4d-4b1c-9ab1-1cad66c5877b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.302171 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" (UID: "6b7de359-0a4d-4b1c-9ab1-1cad66c5877b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.375288 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.375342 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.624206 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d485fb59c-wh26t" event={"ID":"1035b49f-bf1a-44ee-9cd4-01df93145086","Type":"ContainerStarted","Data":"d3c4674d1e84f7d49a84db9bb30193292de6c61c45a663afae3c6dd4c9089393"} Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.666315 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6d485fb59c-wh26t" podStartSLOduration=4.023304434 podStartE2EDuration="6.666295529s" podCreationTimestamp="2025-12-16 15:15:58 +0000 UTC" firstStartedPulling="2025-12-16 15:15:59.998660807 +0000 UTC m=+1284.949739730" lastFinishedPulling="2025-12-16 15:16:02.641651892 +0000 UTC m=+1287.592730825" observedRunningTime="2025-12-16 15:16:04.663189361 +0000 UTC m=+1289.614268284" watchObservedRunningTime="2025-12-16 15:16:04.666295529 +0000 UTC m=+1289.617374452" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.671216 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b7de359-0a4d-4b1c-9ab1-1cad66c5877b","Type":"ContainerDied","Data":"1a90c21a17521237f5bb9900d3cf2c8187c6699b114ca8912efdd100cf43bc3a"} Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.671282 4775 scope.go:117] "RemoveContainer" 
containerID="3c6a86b5ed43347643f221f5902e20865200132a93844894e65cda6fec1a5860" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.671407 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.687061 4775 generic.go:334] "Generic (PLEG): container finished" podID="8a5c773c-3b58-405f-a31f-b4b872509e1e" containerID="4634da530ff6bbd47268c50143df7998ac4625ef3df4f95e643d20e3724931a1" exitCode=143 Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.687093 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a5c773c-3b58-405f-a31f-b4b872509e1e","Type":"ContainerDied","Data":"4634da530ff6bbd47268c50143df7998ac4625ef3df4f95e643d20e3724931a1"} Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.730819 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.736159 4775 scope.go:117] "RemoveContainer" containerID="ad1192d4f9bfce3336e0934046cf26aab5fdc6fa9de19030bc3651ed4b32b8e9" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.739586 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.755475 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:04 crc kubenswrapper[4775]: E1216 15:16:04.755922 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerName="sg-core" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.755939 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerName="sg-core" Dec 16 15:16:04 crc kubenswrapper[4775]: E1216 15:16:04.755948 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1d6d8e3-75bb-4cbc-92e7-9a245f490030" 
containerName="init" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.755953 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d6d8e3-75bb-4cbc-92e7-9a245f490030" containerName="init" Dec 16 15:16:04 crc kubenswrapper[4775]: E1216 15:16:04.755967 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerName="ceilometer-central-agent" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.755973 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerName="ceilometer-central-agent" Dec 16 15:16:04 crc kubenswrapper[4775]: E1216 15:16:04.755988 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerName="ceilometer-notification-agent" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.755996 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerName="ceilometer-notification-agent" Dec 16 15:16:04 crc kubenswrapper[4775]: E1216 15:16:04.756009 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerName="proxy-httpd" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.756015 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerName="proxy-httpd" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.756195 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1d6d8e3-75bb-4cbc-92e7-9a245f490030" containerName="init" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.756208 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerName="sg-core" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.756219 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" 
containerName="ceilometer-notification-agent" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.756228 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerName="proxy-httpd" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.756239 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" containerName="ceilometer-central-agent" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.757829 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.761628 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.761830 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.768102 4775 scope.go:117] "RemoveContainer" containerID="8ccd8ca819b4e266a04faad0a0625b9006c3cb7c757ece3b5742cd732f812992" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.775066 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.825680 4775 scope.go:117] "RemoveContainer" containerID="9e73013142ca6f6ab9e7af148b0250deeda8b2986cc2a0da4fc90089684cbb7f" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.889114 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smnrg\" (UniqueName: \"kubernetes.io/projected/4b12d393-83d8-4db9-8aee-517105ff8484-kube-api-access-smnrg\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.889215 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-config-data\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.889242 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-scripts\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.889280 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b12d393-83d8-4db9-8aee-517105ff8484-log-httpd\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.889301 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.889358 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.889380 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b12d393-83d8-4db9-8aee-517105ff8484-run-httpd\") pod \"ceilometer-0\" 
(UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.990950 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.991115 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b12d393-83d8-4db9-8aee-517105ff8484-run-httpd\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.991173 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smnrg\" (UniqueName: \"kubernetes.io/projected/4b12d393-83d8-4db9-8aee-517105ff8484-kube-api-access-smnrg\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.991244 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-config-data\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.991277 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-scripts\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.991333 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b12d393-83d8-4db9-8aee-517105ff8484-log-httpd\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.991365 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.992981 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b12d393-83d8-4db9-8aee-517105ff8484-log-httpd\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:04 crc kubenswrapper[4775]: I1216 15:16:04.993012 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b12d393-83d8-4db9-8aee-517105ff8484-run-httpd\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:05 crc kubenswrapper[4775]: I1216 15:16:05.001815 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:05 crc kubenswrapper[4775]: I1216 15:16:05.001896 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-config-data\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:05 crc kubenswrapper[4775]: I1216 15:16:05.014327 
4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-scripts\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:05 crc kubenswrapper[4775]: I1216 15:16:05.014509 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:05 crc kubenswrapper[4775]: I1216 15:16:05.016587 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smnrg\" (UniqueName: \"kubernetes.io/projected/4b12d393-83d8-4db9-8aee-517105ff8484-kube-api-access-smnrg\") pod \"ceilometer-0\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " pod="openstack/ceilometer-0" Dec 16 15:16:05 crc kubenswrapper[4775]: I1216 15:16:05.086597 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:05 crc kubenswrapper[4775]: I1216 15:16:05.355210 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b7de359-0a4d-4b1c-9ab1-1cad66c5877b" path="/var/lib/kubelet/pods/6b7de359-0a4d-4b1c-9ab1-1cad66c5877b/volumes" Dec 16 15:16:05 crc kubenswrapper[4775]: I1216 15:16:05.573311 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:05 crc kubenswrapper[4775]: W1216 15:16:05.574503 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b12d393_83d8_4db9_8aee_517105ff8484.slice/crio-e080563d60cae6b51da7793507eba07acbe648bbee40ae0185865ffc41d0c36f WatchSource:0}: Error finding container e080563d60cae6b51da7793507eba07acbe648bbee40ae0185865ffc41d0c36f: Status 404 returned error can't find the container with id e080563d60cae6b51da7793507eba07acbe648bbee40ae0185865ffc41d0c36f Dec 16 15:16:05 crc kubenswrapper[4775]: I1216 15:16:05.701127 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b12d393-83d8-4db9-8aee-517105ff8484","Type":"ContainerStarted","Data":"e080563d60cae6b51da7793507eba07acbe648bbee40ae0185865ffc41d0c36f"} Dec 16 15:16:05 crc kubenswrapper[4775]: I1216 15:16:05.984775 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5cdf7cd694-pv7bs"] Dec 16 15:16:05 crc kubenswrapper[4775]: I1216 15:16:05.988083 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:05 crc kubenswrapper[4775]: I1216 15:16:05.990108 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 16 15:16:05 crc kubenswrapper[4775]: I1216 15:16:05.990692 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.004183 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5cdf7cd694-pv7bs"] Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.110670 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38f1f660-5367-4db0-a653-c72807682175-logs\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.111100 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38f1f660-5367-4db0-a653-c72807682175-config-data-custom\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.111224 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f1f660-5367-4db0-a653-c72807682175-config-data\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.111313 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7gfp\" (UniqueName: 
\"kubernetes.io/projected/38f1f660-5367-4db0-a653-c72807682175-kube-api-access-s7gfp\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.111461 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38f1f660-5367-4db0-a653-c72807682175-internal-tls-certs\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.111673 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f1f660-5367-4db0-a653-c72807682175-combined-ca-bundle\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.111863 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38f1f660-5367-4db0-a653-c72807682175-public-tls-certs\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.213187 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f1f660-5367-4db0-a653-c72807682175-combined-ca-bundle\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.213305 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38f1f660-5367-4db0-a653-c72807682175-public-tls-certs\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.213336 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38f1f660-5367-4db0-a653-c72807682175-logs\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.213361 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38f1f660-5367-4db0-a653-c72807682175-config-data-custom\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.213397 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f1f660-5367-4db0-a653-c72807682175-config-data\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.213432 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7gfp\" (UniqueName: \"kubernetes.io/projected/38f1f660-5367-4db0-a653-c72807682175-kube-api-access-s7gfp\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.213473 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/38f1f660-5367-4db0-a653-c72807682175-internal-tls-certs\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.213945 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38f1f660-5367-4db0-a653-c72807682175-logs\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.220428 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38f1f660-5367-4db0-a653-c72807682175-internal-tls-certs\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.221464 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f1f660-5367-4db0-a653-c72807682175-config-data\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.223060 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f1f660-5367-4db0-a653-c72807682175-combined-ca-bundle\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.223698 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38f1f660-5367-4db0-a653-c72807682175-config-data-custom\") pod 
\"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.224582 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38f1f660-5367-4db0-a653-c72807682175-public-tls-certs\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.231247 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7gfp\" (UniqueName: \"kubernetes.io/projected/38f1f660-5367-4db0-a653-c72807682175-kube-api-access-s7gfp\") pod \"barbican-api-5cdf7cd694-pv7bs\" (UID: \"38f1f660-5367-4db0-a653-c72807682175\") " pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.308754 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.714224 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e34c59d0-d2cd-41cd-990a-bed8a44d1230","Type":"ContainerStarted","Data":"9135ae164f48564b6f75b5cda2cb6334fd1486a2db5bf276d5a1b8decf2327ba"} Dec 16 15:16:06 crc kubenswrapper[4775]: I1216 15:16:06.777369 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5cdf7cd694-pv7bs"] Dec 16 15:16:07 crc kubenswrapper[4775]: I1216 15:16:07.728028 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cdf7cd694-pv7bs" event={"ID":"38f1f660-5367-4db0-a653-c72807682175","Type":"ContainerStarted","Data":"27126bf3973834257e388cc7829a6e198e0c0bda2cb6cc03739addd6bec30504"} Dec 16 15:16:07 crc kubenswrapper[4775]: I1216 15:16:07.728605 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cdf7cd694-pv7bs" event={"ID":"38f1f660-5367-4db0-a653-c72807682175","Type":"ContainerStarted","Data":"45c9e0b04f09ce86c0f9de9fe179d5605aec132ad0c865d73f9c1034e581e135"} Dec 16 15:16:07 crc kubenswrapper[4775]: I1216 15:16:07.728623 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cdf7cd694-pv7bs" event={"ID":"38f1f660-5367-4db0-a653-c72807682175","Type":"ContainerStarted","Data":"0ad10224383c7d82503c54fec05185d1b1355395fb8e4eeb76b9743c366c0b1a"} Dec 16 15:16:07 crc kubenswrapper[4775]: I1216 15:16:07.728674 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:07 crc kubenswrapper[4775]: I1216 15:16:07.728700 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:07 crc kubenswrapper[4775]: I1216 15:16:07.731281 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"e34c59d0-d2cd-41cd-990a-bed8a44d1230","Type":"ContainerStarted","Data":"77597d037b67f44664f1f03a6708c36eb6395c877720c6ce60169d76023fa743"} Dec 16 15:16:07 crc kubenswrapper[4775]: I1216 15:16:07.733825 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b12d393-83d8-4db9-8aee-517105ff8484","Type":"ContainerStarted","Data":"6b7187048db9b4d234ae68b44be6c47b31e1b8bd37d0847e216348740a8d3729"} Dec 16 15:16:07 crc kubenswrapper[4775]: I1216 15:16:07.760322 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5cdf7cd694-pv7bs" podStartSLOduration=2.760303063 podStartE2EDuration="2.760303063s" podCreationTimestamp="2025-12-16 15:16:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:07.754107807 +0000 UTC m=+1292.705186740" watchObservedRunningTime="2025-12-16 15:16:07.760303063 +0000 UTC m=+1292.711381986" Dec 16 15:16:07 crc kubenswrapper[4775]: I1216 15:16:07.783186 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.365848109 podStartE2EDuration="8.783163074s" podCreationTimestamp="2025-12-16 15:15:59 +0000 UTC" firstStartedPulling="2025-12-16 15:16:00.712364734 +0000 UTC m=+1285.663443647" lastFinishedPulling="2025-12-16 15:16:04.129679689 +0000 UTC m=+1289.080758612" observedRunningTime="2025-12-16 15:16:07.777501906 +0000 UTC m=+1292.728580839" watchObservedRunningTime="2025-12-16 15:16:07.783163074 +0000 UTC m=+1292.734241997" Dec 16 15:16:09 crc kubenswrapper[4775]: I1216 15:16:09.852396 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:16:10 crc kubenswrapper[4775]: I1216 15:16:10.066906 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 16 
15:16:10 crc kubenswrapper[4775]: I1216 15:16:10.396049 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:10 crc kubenswrapper[4775]: I1216 15:16:10.478439 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t22wn"] Dec 16 15:16:10 crc kubenswrapper[4775]: I1216 15:16:10.482557 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-t22wn" podUID="864f3a02-d697-4a42-b8fc-2aafd912bc62" containerName="dnsmasq-dns" containerID="cri-o://8ea4ab6cce773d83c4f4ad66fa77cb5adb6b1bebe94e845f00036470cfaa6533" gracePeriod=10 Dec 16 15:16:10 crc kubenswrapper[4775]: I1216 15:16:10.790316 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b12d393-83d8-4db9-8aee-517105ff8484","Type":"ContainerStarted","Data":"9b3e98d922ff0030997f7c093c8c5083a66dad7204d5b26869dbc5e97f4249e8"} Dec 16 15:16:10 crc kubenswrapper[4775]: I1216 15:16:10.799506 4775 generic.go:334] "Generic (PLEG): container finished" podID="864f3a02-d697-4a42-b8fc-2aafd912bc62" containerID="8ea4ab6cce773d83c4f4ad66fa77cb5adb6b1bebe94e845f00036470cfaa6533" exitCode=0 Dec 16 15:16:10 crc kubenswrapper[4775]: I1216 15:16:10.799580 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t22wn" event={"ID":"864f3a02-d697-4a42-b8fc-2aafd912bc62","Type":"ContainerDied","Data":"8ea4ab6cce773d83c4f4ad66fa77cb5adb6b1bebe94e845f00036470cfaa6533"} Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.091426 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.225369 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-dns-svc\") pod \"864f3a02-d697-4a42-b8fc-2aafd912bc62\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.228336 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hr84\" (UniqueName: \"kubernetes.io/projected/864f3a02-d697-4a42-b8fc-2aafd912bc62-kube-api-access-5hr84\") pod \"864f3a02-d697-4a42-b8fc-2aafd912bc62\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.228566 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-ovsdbserver-sb\") pod \"864f3a02-d697-4a42-b8fc-2aafd912bc62\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.228781 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-dns-swift-storage-0\") pod \"864f3a02-d697-4a42-b8fc-2aafd912bc62\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.229123 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-ovsdbserver-nb\") pod \"864f3a02-d697-4a42-b8fc-2aafd912bc62\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.229248 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-config\") pod \"864f3a02-d697-4a42-b8fc-2aafd912bc62\" (UID: \"864f3a02-d697-4a42-b8fc-2aafd912bc62\") " Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.251257 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/864f3a02-d697-4a42-b8fc-2aafd912bc62-kube-api-access-5hr84" (OuterVolumeSpecName: "kube-api-access-5hr84") pod "864f3a02-d697-4a42-b8fc-2aafd912bc62" (UID: "864f3a02-d697-4a42-b8fc-2aafd912bc62"). InnerVolumeSpecName "kube-api-access-5hr84". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.297019 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "864f3a02-d697-4a42-b8fc-2aafd912bc62" (UID: "864f3a02-d697-4a42-b8fc-2aafd912bc62"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.305744 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-config" (OuterVolumeSpecName: "config") pod "864f3a02-d697-4a42-b8fc-2aafd912bc62" (UID: "864f3a02-d697-4a42-b8fc-2aafd912bc62"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.311355 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "864f3a02-d697-4a42-b8fc-2aafd912bc62" (UID: "864f3a02-d697-4a42-b8fc-2aafd912bc62"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.319150 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "864f3a02-d697-4a42-b8fc-2aafd912bc62" (UID: "864f3a02-d697-4a42-b8fc-2aafd912bc62"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.327808 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "864f3a02-d697-4a42-b8fc-2aafd912bc62" (UID: "864f3a02-d697-4a42-b8fc-2aafd912bc62"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.332585 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.332648 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.332665 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.332679 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hr84\" (UniqueName: \"kubernetes.io/projected/864f3a02-d697-4a42-b8fc-2aafd912bc62-kube-api-access-5hr84\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:11 crc 
kubenswrapper[4775]: I1216 15:16:11.332696 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.332707 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/864f3a02-d697-4a42-b8fc-2aafd912bc62-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.820321 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t22wn" event={"ID":"864f3a02-d697-4a42-b8fc-2aafd912bc62","Type":"ContainerDied","Data":"cf12d723ec36a01fe624e42171f675c696a13e5573ade9e5f6b3d86ea333ce43"} Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.821100 4775 scope.go:117] "RemoveContainer" containerID="8ea4ab6cce773d83c4f4ad66fa77cb5adb6b1bebe94e845f00036470cfaa6533" Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.820389 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t22wn" Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.845589 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t22wn"] Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.854746 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t22wn"] Dec 16 15:16:11 crc kubenswrapper[4775]: I1216 15:16:11.989801 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:16:12 crc kubenswrapper[4775]: I1216 15:16:12.004023 4775 scope.go:117] "RemoveContainer" containerID="ef0b12ddd3a5b7c0b070733b0d227a77dca5c57ab14c432c934bba2ad989d93f" Dec 16 15:16:12 crc kubenswrapper[4775]: I1216 15:16:12.065475 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:16:12 crc kubenswrapper[4775]: I1216 15:16:12.835166 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-69cbb5df9f-wmvhj" Dec 16 15:16:12 crc kubenswrapper[4775]: I1216 15:16:12.886919 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-69d967f7b4-xcrpt"] Dec 16 15:16:12 crc kubenswrapper[4775]: I1216 15:16:12.887185 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-69d967f7b4-xcrpt" podUID="62ce6699-2bf8-4133-ae72-6d91903df144" containerName="neutron-api" containerID="cri-o://168eba86b02448067af48492642ba60e548ea855e67f9cc16dc59fcc0259b829" gracePeriod=30 Dec 16 15:16:12 crc kubenswrapper[4775]: I1216 15:16:12.887245 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-69d967f7b4-xcrpt" podUID="62ce6699-2bf8-4133-ae72-6d91903df144" containerName="neutron-httpd" containerID="cri-o://34cd39d5a48688c95c58599921f90d7909125613d4905cb3cf4ec80967141470" gracePeriod=30 Dec 16 
15:16:13 crc kubenswrapper[4775]: I1216 15:16:13.142502 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:16:13 crc kubenswrapper[4775]: I1216 15:16:13.162243 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-c66959c54-2xm6x" Dec 16 15:16:13 crc kubenswrapper[4775]: I1216 15:16:13.375449 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="864f3a02-d697-4a42-b8fc-2aafd912bc62" path="/var/lib/kubelet/pods/864f3a02-d697-4a42-b8fc-2aafd912bc62/volumes" Dec 16 15:16:13 crc kubenswrapper[4775]: I1216 15:16:13.376471 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 16 15:16:13 crc kubenswrapper[4775]: I1216 15:16:13.449787 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:13 crc kubenswrapper[4775]: I1216 15:16:13.854612 4775 generic.go:334] "Generic (PLEG): container finished" podID="62ce6699-2bf8-4133-ae72-6d91903df144" containerID="34cd39d5a48688c95c58599921f90d7909125613d4905cb3cf4ec80967141470" exitCode=0 Dec 16 15:16:13 crc kubenswrapper[4775]: I1216 15:16:13.854756 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d967f7b4-xcrpt" event={"ID":"62ce6699-2bf8-4133-ae72-6d91903df144","Type":"ContainerDied","Data":"34cd39d5a48688c95c58599921f90d7909125613d4905cb3cf4ec80967141470"} Dec 16 15:16:13 crc kubenswrapper[4775]: I1216 15:16:13.863128 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b12d393-83d8-4db9-8aee-517105ff8484","Type":"ContainerStarted","Data":"b17246be99afe61b30e278dcf66d4e0e0e4995fd2071d7a46be4d8fba61e0353"} Dec 16 15:16:14 crc kubenswrapper[4775]: I1216 15:16:14.622518 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-779ff79b57-nb7bt" Dec 16 
15:16:15 crc kubenswrapper[4775]: I1216 15:16:15.283653 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5cdf7cd694-pv7bs" Dec 16 15:16:15 crc kubenswrapper[4775]: I1216 15:16:15.362695 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-779fdf9558-tvqbl"] Dec 16 15:16:15 crc kubenswrapper[4775]: I1216 15:16:15.363215 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-779fdf9558-tvqbl" podUID="1a040578-d21e-448d-b653-15547567f335" containerName="barbican-api-log" containerID="cri-o://d7c8fcbc22d506c3d006fd714b4a2a0e7730446f5aecd05518e3ca018538c339" gracePeriod=30 Dec 16 15:16:15 crc kubenswrapper[4775]: I1216 15:16:15.363336 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-779fdf9558-tvqbl" podUID="1a040578-d21e-448d-b653-15547567f335" containerName="barbican-api" containerID="cri-o://25e112847f41c5698171e2bcc24b3845e7f4a7b459592f7a07e46c23b81ad450" gracePeriod=30 Dec 16 15:16:15 crc kubenswrapper[4775]: I1216 15:16:15.373564 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-779fdf9558-tvqbl" podUID="1a040578-d21e-448d-b653-15547567f335" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": EOF" Dec 16 15:16:15 crc kubenswrapper[4775]: I1216 15:16:15.408364 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 16 15:16:15 crc kubenswrapper[4775]: I1216 15:16:15.468181 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 15:16:15 crc kubenswrapper[4775]: I1216 15:16:15.883269 4775 generic.go:334] "Generic (PLEG): container finished" podID="1a040578-d21e-448d-b653-15547567f335" containerID="d7c8fcbc22d506c3d006fd714b4a2a0e7730446f5aecd05518e3ca018538c339" exitCode=143 Dec 16 15:16:15 crc 
kubenswrapper[4775]: I1216 15:16:15.883324 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-779fdf9558-tvqbl" event={"ID":"1a040578-d21e-448d-b653-15547567f335","Type":"ContainerDied","Data":"d7c8fcbc22d506c3d006fd714b4a2a0e7730446f5aecd05518e3ca018538c339"} Dec 16 15:16:15 crc kubenswrapper[4775]: I1216 15:16:15.883880 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e34c59d0-d2cd-41cd-990a-bed8a44d1230" containerName="cinder-scheduler" containerID="cri-o://9135ae164f48564b6f75b5cda2cb6334fd1486a2db5bf276d5a1b8decf2327ba" gracePeriod=30 Dec 16 15:16:15 crc kubenswrapper[4775]: I1216 15:16:15.884059 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e34c59d0-d2cd-41cd-990a-bed8a44d1230" containerName="probe" containerID="cri-o://77597d037b67f44664f1f03a6708c36eb6395c877720c6ce60169d76023fa743" gracePeriod=30 Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.503720 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 16 15:16:16 crc kubenswrapper[4775]: E1216 15:16:16.504567 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="864f3a02-d697-4a42-b8fc-2aafd912bc62" containerName="init" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.504585 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="864f3a02-d697-4a42-b8fc-2aafd912bc62" containerName="init" Dec 16 15:16:16 crc kubenswrapper[4775]: E1216 15:16:16.504623 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="864f3a02-d697-4a42-b8fc-2aafd912bc62" containerName="dnsmasq-dns" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.504632 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="864f3a02-d697-4a42-b8fc-2aafd912bc62" containerName="dnsmasq-dns" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.504868 4775 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="864f3a02-d697-4a42-b8fc-2aafd912bc62" containerName="dnsmasq-dns" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.505677 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.511149 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.511478 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.511620 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-5wh2q" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.548711 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.644253 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lfnl\" (UniqueName: \"kubernetes.io/projected/a9f81b8a-3b7e-4984-946f-2de17873b97a-kube-api-access-7lfnl\") pod \"openstackclient\" (UID: \"a9f81b8a-3b7e-4984-946f-2de17873b97a\") " pod="openstack/openstackclient" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.644334 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f81b8a-3b7e-4984-946f-2de17873b97a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a9f81b8a-3b7e-4984-946f-2de17873b97a\") " pod="openstack/openstackclient" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.644478 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/a9f81b8a-3b7e-4984-946f-2de17873b97a-openstack-config-secret\") pod \"openstackclient\" (UID: \"a9f81b8a-3b7e-4984-946f-2de17873b97a\") " pod="openstack/openstackclient" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.644535 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a9f81b8a-3b7e-4984-946f-2de17873b97a-openstack-config\") pod \"openstackclient\" (UID: \"a9f81b8a-3b7e-4984-946f-2de17873b97a\") " pod="openstack/openstackclient" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.746717 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a9f81b8a-3b7e-4984-946f-2de17873b97a-openstack-config-secret\") pod \"openstackclient\" (UID: \"a9f81b8a-3b7e-4984-946f-2de17873b97a\") " pod="openstack/openstackclient" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.747150 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a9f81b8a-3b7e-4984-946f-2de17873b97a-openstack-config\") pod \"openstackclient\" (UID: \"a9f81b8a-3b7e-4984-946f-2de17873b97a\") " pod="openstack/openstackclient" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.747206 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lfnl\" (UniqueName: \"kubernetes.io/projected/a9f81b8a-3b7e-4984-946f-2de17873b97a-kube-api-access-7lfnl\") pod \"openstackclient\" (UID: \"a9f81b8a-3b7e-4984-946f-2de17873b97a\") " pod="openstack/openstackclient" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.747261 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f81b8a-3b7e-4984-946f-2de17873b97a-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"a9f81b8a-3b7e-4984-946f-2de17873b97a\") " pod="openstack/openstackclient" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.748315 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a9f81b8a-3b7e-4984-946f-2de17873b97a-openstack-config\") pod \"openstackclient\" (UID: \"a9f81b8a-3b7e-4984-946f-2de17873b97a\") " pod="openstack/openstackclient" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.753479 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a9f81b8a-3b7e-4984-946f-2de17873b97a-openstack-config-secret\") pod \"openstackclient\" (UID: \"a9f81b8a-3b7e-4984-946f-2de17873b97a\") " pod="openstack/openstackclient" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.763767 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f81b8a-3b7e-4984-946f-2de17873b97a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a9f81b8a-3b7e-4984-946f-2de17873b97a\") " pod="openstack/openstackclient" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.769199 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lfnl\" (UniqueName: \"kubernetes.io/projected/a9f81b8a-3b7e-4984-946f-2de17873b97a-kube-api-access-7lfnl\") pod \"openstackclient\" (UID: \"a9f81b8a-3b7e-4984-946f-2de17873b97a\") " pod="openstack/openstackclient" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.893939 4775 generic.go:334] "Generic (PLEG): container finished" podID="e34c59d0-d2cd-41cd-990a-bed8a44d1230" containerID="77597d037b67f44664f1f03a6708c36eb6395c877720c6ce60169d76023fa743" exitCode=0 Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.893992 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"e34c59d0-d2cd-41cd-990a-bed8a44d1230","Type":"ContainerDied","Data":"77597d037b67f44664f1f03a6708c36eb6395c877720c6ce60169d76023fa743"} Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.896436 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b12d393-83d8-4db9-8aee-517105ff8484","Type":"ContainerStarted","Data":"79283e9e5c95601260fa349de4b8002d22b523e76312c76682c46ac799e6d7de"} Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.896661 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.928024 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.044903614 podStartE2EDuration="12.927999998s" podCreationTimestamp="2025-12-16 15:16:04 +0000 UTC" firstStartedPulling="2025-12-16 15:16:05.576835256 +0000 UTC m=+1290.527914179" lastFinishedPulling="2025-12-16 15:16:16.45993164 +0000 UTC m=+1301.411010563" observedRunningTime="2025-12-16 15:16:16.917406804 +0000 UTC m=+1301.868485747" watchObservedRunningTime="2025-12-16 15:16:16.927999998 +0000 UTC m=+1301.879078921" Dec 16 15:16:16 crc kubenswrapper[4775]: I1216 15:16:16.973596 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 15:16:17 crc kubenswrapper[4775]: I1216 15:16:17.516110 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 15:16:17 crc kubenswrapper[4775]: W1216 15:16:17.516185 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9f81b8a_3b7e_4984_946f_2de17873b97a.slice/crio-6d5581fa001f6e4aca0b390e1b41d4455d719c08e69b1b785421cd0d0084bbc2 WatchSource:0}: Error finding container 6d5581fa001f6e4aca0b390e1b41d4455d719c08e69b1b785421cd0d0084bbc2: Status 404 returned error can't find the container with id 6d5581fa001f6e4aca0b390e1b41d4455d719c08e69b1b785421cd0d0084bbc2 Dec 16 15:16:17 crc kubenswrapper[4775]: I1216 15:16:17.905433 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a9f81b8a-3b7e-4984-946f-2de17873b97a","Type":"ContainerStarted","Data":"6d5581fa001f6e4aca0b390e1b41d4455d719c08e69b1b785421cd0d0084bbc2"} Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.689607 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.694583 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.817029 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-config-data-custom\") pod \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.817107 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e34c59d0-d2cd-41cd-990a-bed8a44d1230-etc-machine-id\") pod \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.817172 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vf5c\" (UniqueName: \"kubernetes.io/projected/62ce6699-2bf8-4133-ae72-6d91903df144-kube-api-access-8vf5c\") pod \"62ce6699-2bf8-4133-ae72-6d91903df144\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.817202 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e34c59d0-d2cd-41cd-990a-bed8a44d1230-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e34c59d0-d2cd-41cd-990a-bed8a44d1230" (UID: "e34c59d0-d2cd-41cd-990a-bed8a44d1230"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.817221 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-httpd-config\") pod \"62ce6699-2bf8-4133-ae72-6d91903df144\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.817264 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-combined-ca-bundle\") pod \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.817312 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-scripts\") pod \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.817342 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-ovndb-tls-certs\") pod \"62ce6699-2bf8-4133-ae72-6d91903df144\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.817367 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-config\") pod \"62ce6699-2bf8-4133-ae72-6d91903df144\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.817426 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjxct\" (UniqueName: 
\"kubernetes.io/projected/e34c59d0-d2cd-41cd-990a-bed8a44d1230-kube-api-access-bjxct\") pod \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.817463 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-combined-ca-bundle\") pod \"62ce6699-2bf8-4133-ae72-6d91903df144\" (UID: \"62ce6699-2bf8-4133-ae72-6d91903df144\") " Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.817499 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-config-data\") pod \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\" (UID: \"e34c59d0-d2cd-41cd-990a-bed8a44d1230\") " Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.818146 4775 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e34c59d0-d2cd-41cd-990a-bed8a44d1230-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.828049 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-scripts" (OuterVolumeSpecName: "scripts") pod "e34c59d0-d2cd-41cd-990a-bed8a44d1230" (UID: "e34c59d0-d2cd-41cd-990a-bed8a44d1230"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.828122 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e34c59d0-d2cd-41cd-990a-bed8a44d1230-kube-api-access-bjxct" (OuterVolumeSpecName: "kube-api-access-bjxct") pod "e34c59d0-d2cd-41cd-990a-bed8a44d1230" (UID: "e34c59d0-d2cd-41cd-990a-bed8a44d1230"). InnerVolumeSpecName "kube-api-access-bjxct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.828253 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e34c59d0-d2cd-41cd-990a-bed8a44d1230" (UID: "e34c59d0-d2cd-41cd-990a-bed8a44d1230"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.828806 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "62ce6699-2bf8-4133-ae72-6d91903df144" (UID: "62ce6699-2bf8-4133-ae72-6d91903df144"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.829034 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62ce6699-2bf8-4133-ae72-6d91903df144-kube-api-access-8vf5c" (OuterVolumeSpecName: "kube-api-access-8vf5c") pod "62ce6699-2bf8-4133-ae72-6d91903df144" (UID: "62ce6699-2bf8-4133-ae72-6d91903df144"). InnerVolumeSpecName "kube-api-access-8vf5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.843143 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-779fdf9558-tvqbl" podUID="1a040578-d21e-448d-b653-15547567f335" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:59444->10.217.0.157:9311: read: connection reset by peer" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.846065 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-779fdf9558-tvqbl" podUID="1a040578-d21e-448d-b653-15547567f335" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:59430->10.217.0.157:9311: read: connection reset by peer" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.877844 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62ce6699-2bf8-4133-ae72-6d91903df144" (UID: "62ce6699-2bf8-4133-ae72-6d91903df144"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.895248 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-config" (OuterVolumeSpecName: "config") pod "62ce6699-2bf8-4133-ae72-6d91903df144" (UID: "62ce6699-2bf8-4133-ae72-6d91903df144"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.909239 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e34c59d0-d2cd-41cd-990a-bed8a44d1230" (UID: "e34c59d0-d2cd-41cd-990a-bed8a44d1230"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.920187 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vf5c\" (UniqueName: \"kubernetes.io/projected/62ce6699-2bf8-4133-ae72-6d91903df144-kube-api-access-8vf5c\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.920699 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.920781 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.920974 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.921051 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.921105 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjxct\" (UniqueName: 
\"kubernetes.io/projected/e34c59d0-d2cd-41cd-990a-bed8a44d1230-kube-api-access-bjxct\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.921157 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.921209 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.938610 4775 generic.go:334] "Generic (PLEG): container finished" podID="e34c59d0-d2cd-41cd-990a-bed8a44d1230" containerID="9135ae164f48564b6f75b5cda2cb6334fd1486a2db5bf276d5a1b8decf2327ba" exitCode=0 Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.938680 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.938680 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e34c59d0-d2cd-41cd-990a-bed8a44d1230","Type":"ContainerDied","Data":"9135ae164f48564b6f75b5cda2cb6334fd1486a2db5bf276d5a1b8decf2327ba"} Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.938766 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e34c59d0-d2cd-41cd-990a-bed8a44d1230","Type":"ContainerDied","Data":"2bb4272982f64bfbeef5c9e9895ae32850810f1c7df7ba3c935ca07209a31486"} Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.938784 4775 scope.go:117] "RemoveContainer" containerID="77597d037b67f44664f1f03a6708c36eb6395c877720c6ce60169d76023fa743" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.939260 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "62ce6699-2bf8-4133-ae72-6d91903df144" (UID: "62ce6699-2bf8-4133-ae72-6d91903df144"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.943537 4775 generic.go:334] "Generic (PLEG): container finished" podID="1a040578-d21e-448d-b653-15547567f335" containerID="25e112847f41c5698171e2bcc24b3845e7f4a7b459592f7a07e46c23b81ad450" exitCode=0 Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.943631 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-779fdf9558-tvqbl" event={"ID":"1a040578-d21e-448d-b653-15547567f335","Type":"ContainerDied","Data":"25e112847f41c5698171e2bcc24b3845e7f4a7b459592f7a07e46c23b81ad450"} Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.949436 4775 generic.go:334] "Generic (PLEG): container finished" podID="62ce6699-2bf8-4133-ae72-6d91903df144" containerID="168eba86b02448067af48492642ba60e548ea855e67f9cc16dc59fcc0259b829" exitCode=0 Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.949484 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d967f7b4-xcrpt" event={"ID":"62ce6699-2bf8-4133-ae72-6d91903df144","Type":"ContainerDied","Data":"168eba86b02448067af48492642ba60e548ea855e67f9cc16dc59fcc0259b829"} Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.949519 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d967f7b4-xcrpt" event={"ID":"62ce6699-2bf8-4133-ae72-6d91903df144","Type":"ContainerDied","Data":"a45e1f82d4328081e0ceefa89bd62f300f020a6f07d27965485858b80447d6fe"} Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.949590 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-69d967f7b4-xcrpt" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.956168 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-config-data" (OuterVolumeSpecName: "config-data") pod "e34c59d0-d2cd-41cd-990a-bed8a44d1230" (UID: "e34c59d0-d2cd-41cd-990a-bed8a44d1230"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.966117 4775 scope.go:117] "RemoveContainer" containerID="9135ae164f48564b6f75b5cda2cb6334fd1486a2db5bf276d5a1b8decf2327ba" Dec 16 15:16:19 crc kubenswrapper[4775]: I1216 15:16:19.999644 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-69d967f7b4-xcrpt"] Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.026006 4775 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62ce6699-2bf8-4133-ae72-6d91903df144-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.026037 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e34c59d0-d2cd-41cd-990a-bed8a44d1230-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.026722 4775 scope.go:117] "RemoveContainer" containerID="77597d037b67f44664f1f03a6708c36eb6395c877720c6ce60169d76023fa743" Dec 16 15:16:20 crc kubenswrapper[4775]: E1216 15:16:20.028094 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77597d037b67f44664f1f03a6708c36eb6395c877720c6ce60169d76023fa743\": container with ID starting with 77597d037b67f44664f1f03a6708c36eb6395c877720c6ce60169d76023fa743 not found: ID does not exist" 
containerID="77597d037b67f44664f1f03a6708c36eb6395c877720c6ce60169d76023fa743" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.028134 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77597d037b67f44664f1f03a6708c36eb6395c877720c6ce60169d76023fa743"} err="failed to get container status \"77597d037b67f44664f1f03a6708c36eb6395c877720c6ce60169d76023fa743\": rpc error: code = NotFound desc = could not find container \"77597d037b67f44664f1f03a6708c36eb6395c877720c6ce60169d76023fa743\": container with ID starting with 77597d037b67f44664f1f03a6708c36eb6395c877720c6ce60169d76023fa743 not found: ID does not exist" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.028157 4775 scope.go:117] "RemoveContainer" containerID="9135ae164f48564b6f75b5cda2cb6334fd1486a2db5bf276d5a1b8decf2327ba" Dec 16 15:16:20 crc kubenswrapper[4775]: E1216 15:16:20.028384 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9135ae164f48564b6f75b5cda2cb6334fd1486a2db5bf276d5a1b8decf2327ba\": container with ID starting with 9135ae164f48564b6f75b5cda2cb6334fd1486a2db5bf276d5a1b8decf2327ba not found: ID does not exist" containerID="9135ae164f48564b6f75b5cda2cb6334fd1486a2db5bf276d5a1b8decf2327ba" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.028407 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9135ae164f48564b6f75b5cda2cb6334fd1486a2db5bf276d5a1b8decf2327ba"} err="failed to get container status \"9135ae164f48564b6f75b5cda2cb6334fd1486a2db5bf276d5a1b8decf2327ba\": rpc error: code = NotFound desc = could not find container \"9135ae164f48564b6f75b5cda2cb6334fd1486a2db5bf276d5a1b8decf2327ba\": container with ID starting with 9135ae164f48564b6f75b5cda2cb6334fd1486a2db5bf276d5a1b8decf2327ba not found: ID does not exist" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.028450 4775 scope.go:117] 
"RemoveContainer" containerID="34cd39d5a48688c95c58599921f90d7909125613d4905cb3cf4ec80967141470" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.035652 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-69d967f7b4-xcrpt"] Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.106041 4775 scope.go:117] "RemoveContainer" containerID="168eba86b02448067af48492642ba60e548ea855e67f9cc16dc59fcc0259b829" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.150461 4775 scope.go:117] "RemoveContainer" containerID="34cd39d5a48688c95c58599921f90d7909125613d4905cb3cf4ec80967141470" Dec 16 15:16:20 crc kubenswrapper[4775]: E1216 15:16:20.151184 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34cd39d5a48688c95c58599921f90d7909125613d4905cb3cf4ec80967141470\": container with ID starting with 34cd39d5a48688c95c58599921f90d7909125613d4905cb3cf4ec80967141470 not found: ID does not exist" containerID="34cd39d5a48688c95c58599921f90d7909125613d4905cb3cf4ec80967141470" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.151210 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34cd39d5a48688c95c58599921f90d7909125613d4905cb3cf4ec80967141470"} err="failed to get container status \"34cd39d5a48688c95c58599921f90d7909125613d4905cb3cf4ec80967141470\": rpc error: code = NotFound desc = could not find container \"34cd39d5a48688c95c58599921f90d7909125613d4905cb3cf4ec80967141470\": container with ID starting with 34cd39d5a48688c95c58599921f90d7909125613d4905cb3cf4ec80967141470 not found: ID does not exist" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.151232 4775 scope.go:117] "RemoveContainer" containerID="168eba86b02448067af48492642ba60e548ea855e67f9cc16dc59fcc0259b829" Dec 16 15:16:20 crc kubenswrapper[4775]: E1216 15:16:20.151418 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"168eba86b02448067af48492642ba60e548ea855e67f9cc16dc59fcc0259b829\": container with ID starting with 168eba86b02448067af48492642ba60e548ea855e67f9cc16dc59fcc0259b829 not found: ID does not exist" containerID="168eba86b02448067af48492642ba60e548ea855e67f9cc16dc59fcc0259b829" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.151437 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168eba86b02448067af48492642ba60e548ea855e67f9cc16dc59fcc0259b829"} err="failed to get container status \"168eba86b02448067af48492642ba60e548ea855e67f9cc16dc59fcc0259b829\": rpc error: code = NotFound desc = could not find container \"168eba86b02448067af48492642ba60e548ea855e67f9cc16dc59fcc0259b829\": container with ID starting with 168eba86b02448067af48492642ba60e548ea855e67f9cc16dc59fcc0259b829 not found: ID does not exist" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.288774 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.329462 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.341833 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 15:16:20 crc kubenswrapper[4775]: E1216 15:16:20.342311 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ce6699-2bf8-4133-ae72-6d91903df144" containerName="neutron-api" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.342328 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ce6699-2bf8-4133-ae72-6d91903df144" containerName="neutron-api" Dec 16 15:16:20 crc kubenswrapper[4775]: E1216 15:16:20.342341 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e34c59d0-d2cd-41cd-990a-bed8a44d1230" containerName="probe" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 
15:16:20.342348 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e34c59d0-d2cd-41cd-990a-bed8a44d1230" containerName="probe" Dec 16 15:16:20 crc kubenswrapper[4775]: E1216 15:16:20.342371 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ce6699-2bf8-4133-ae72-6d91903df144" containerName="neutron-httpd" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.342380 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ce6699-2bf8-4133-ae72-6d91903df144" containerName="neutron-httpd" Dec 16 15:16:20 crc kubenswrapper[4775]: E1216 15:16:20.342401 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e34c59d0-d2cd-41cd-990a-bed8a44d1230" containerName="cinder-scheduler" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.342408 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e34c59d0-d2cd-41cd-990a-bed8a44d1230" containerName="cinder-scheduler" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.342582 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="62ce6699-2bf8-4133-ae72-6d91903df144" containerName="neutron-httpd" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.342596 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="62ce6699-2bf8-4133-ae72-6d91903df144" containerName="neutron-api" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.342606 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e34c59d0-d2cd-41cd-990a-bed8a44d1230" containerName="cinder-scheduler" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.342618 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e34c59d0-d2cd-41cd-990a-bed8a44d1230" containerName="probe" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.343630 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.346055 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.354988 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.367755 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.435109 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw9hx\" (UniqueName: \"kubernetes.io/projected/fc19d8c3-9cbc-45db-ad19-ab8a38792218-kube-api-access-cw9hx\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.435157 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc19d8c3-9cbc-45db-ad19-ab8a38792218-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.435178 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc19d8c3-9cbc-45db-ad19-ab8a38792218-scripts\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.435300 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fc19d8c3-9cbc-45db-ad19-ab8a38792218-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.435341 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc19d8c3-9cbc-45db-ad19-ab8a38792218-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.435392 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc19d8c3-9cbc-45db-ad19-ab8a38792218-config-data\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.536513 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a040578-d21e-448d-b653-15547567f335-logs\") pod \"1a040578-d21e-448d-b653-15547567f335\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.536620 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mkh6\" (UniqueName: \"kubernetes.io/projected/1a040578-d21e-448d-b653-15547567f335-kube-api-access-2mkh6\") pod \"1a040578-d21e-448d-b653-15547567f335\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.536689 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-combined-ca-bundle\") pod \"1a040578-d21e-448d-b653-15547567f335\" (UID: 
\"1a040578-d21e-448d-b653-15547567f335\") " Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.536734 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-config-data\") pod \"1a040578-d21e-448d-b653-15547567f335\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.536925 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-config-data-custom\") pod \"1a040578-d21e-448d-b653-15547567f335\" (UID: \"1a040578-d21e-448d-b653-15547567f335\") " Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.537180 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw9hx\" (UniqueName: \"kubernetes.io/projected/fc19d8c3-9cbc-45db-ad19-ab8a38792218-kube-api-access-cw9hx\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.537203 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc19d8c3-9cbc-45db-ad19-ab8a38792218-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.537213 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a040578-d21e-448d-b653-15547567f335-logs" (OuterVolumeSpecName: "logs") pod "1a040578-d21e-448d-b653-15547567f335" (UID: "1a040578-d21e-448d-b653-15547567f335"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.537225 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc19d8c3-9cbc-45db-ad19-ab8a38792218-scripts\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.537342 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc19d8c3-9cbc-45db-ad19-ab8a38792218-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.537383 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc19d8c3-9cbc-45db-ad19-ab8a38792218-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.537393 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc19d8c3-9cbc-45db-ad19-ab8a38792218-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.537416 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc19d8c3-9cbc-45db-ad19-ab8a38792218-config-data\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.546705 4775 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc19d8c3-9cbc-45db-ad19-ab8a38792218-scripts\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.548691 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc19d8c3-9cbc-45db-ad19-ab8a38792218-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.553714 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a040578-d21e-448d-b653-15547567f335-kube-api-access-2mkh6" (OuterVolumeSpecName: "kube-api-access-2mkh6") pod "1a040578-d21e-448d-b653-15547567f335" (UID: "1a040578-d21e-448d-b653-15547567f335"). InnerVolumeSpecName "kube-api-access-2mkh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.554335 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc19d8c3-9cbc-45db-ad19-ab8a38792218-config-data\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.557782 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw9hx\" (UniqueName: \"kubernetes.io/projected/fc19d8c3-9cbc-45db-ad19-ab8a38792218-kube-api-access-cw9hx\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.563135 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc19d8c3-9cbc-45db-ad19-ab8a38792218-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fc19d8c3-9cbc-45db-ad19-ab8a38792218\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.570071 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1a040578-d21e-448d-b653-15547567f335" (UID: "1a040578-d21e-448d-b653-15547567f335"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.590423 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a040578-d21e-448d-b653-15547567f335" (UID: "1a040578-d21e-448d-b653-15547567f335"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.613501 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-config-data" (OuterVolumeSpecName: "config-data") pod "1a040578-d21e-448d-b653-15547567f335" (UID: "1a040578-d21e-448d-b653-15547567f335"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.639292 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.639371 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a040578-d21e-448d-b653-15547567f335-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.639388 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mkh6\" (UniqueName: \"kubernetes.io/projected/1a040578-d21e-448d-b653-15547567f335-kube-api-access-2mkh6\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.639406 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.639417 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a040578-d21e-448d-b653-15547567f335-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.683250 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.959142 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-779fdf9558-tvqbl" event={"ID":"1a040578-d21e-448d-b653-15547567f335","Type":"ContainerDied","Data":"1d42bdc28fb0a15a30167e836b7ce0e6be45f53b1afcb0919543d4b653e2bfa7"} Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.959214 4775 scope.go:117] "RemoveContainer" containerID="25e112847f41c5698171e2bcc24b3845e7f4a7b459592f7a07e46c23b81ad450" Dec 16 15:16:20 crc kubenswrapper[4775]: I1216 15:16:20.959308 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-779fdf9558-tvqbl" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.007358 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-779fdf9558-tvqbl"] Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.013061 4775 scope.go:117] "RemoveContainer" containerID="d7c8fcbc22d506c3d006fd714b4a2a0e7730446f5aecd05518e3ca018538c339" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.015757 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-779fdf9558-tvqbl"] Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.166289 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-78c6fdf4b7-xxfgx"] Dec 16 15:16:21 crc kubenswrapper[4775]: E1216 15:16:21.166800 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a040578-d21e-448d-b653-15547567f335" containerName="barbican-api-log" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.166816 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a040578-d21e-448d-b653-15547567f335" containerName="barbican-api-log" Dec 16 15:16:21 crc kubenswrapper[4775]: E1216 15:16:21.166832 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a040578-d21e-448d-b653-15547567f335" 
containerName="barbican-api" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.166839 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a040578-d21e-448d-b653-15547567f335" containerName="barbican-api" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.167025 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a040578-d21e-448d-b653-15547567f335" containerName="barbican-api" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.167047 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a040578-d21e-448d-b653-15547567f335" containerName="barbican-api-log" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.168053 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.172863 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.173182 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.173499 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.178302 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.205525 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-78c6fdf4b7-xxfgx"] Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.254218 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15f3da25-9cb6-406e-b022-935c6201ea4a-run-httpd\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " 
pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.254318 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f3da25-9cb6-406e-b022-935c6201ea4a-config-data\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.254345 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f3da25-9cb6-406e-b022-935c6201ea4a-public-tls-certs\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.254399 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15f3da25-9cb6-406e-b022-935c6201ea4a-etc-swift\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.254432 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f3da25-9cb6-406e-b022-935c6201ea4a-combined-ca-bundle\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.254488 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15f3da25-9cb6-406e-b022-935c6201ea4a-log-httpd\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: 
\"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.254518 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmplj\" (UniqueName: \"kubernetes.io/projected/15f3da25-9cb6-406e-b022-935c6201ea4a-kube-api-access-pmplj\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.254621 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f3da25-9cb6-406e-b022-935c6201ea4a-internal-tls-certs\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.355958 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15f3da25-9cb6-406e-b022-935c6201ea4a-run-httpd\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.356343 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f3da25-9cb6-406e-b022-935c6201ea4a-config-data\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.357194 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f3da25-9cb6-406e-b022-935c6201ea4a-public-tls-certs\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: 
\"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.357327 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15f3da25-9cb6-406e-b022-935c6201ea4a-etc-swift\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.357436 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f3da25-9cb6-406e-b022-935c6201ea4a-combined-ca-bundle\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.357525 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15f3da25-9cb6-406e-b022-935c6201ea4a-log-httpd\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.357612 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmplj\" (UniqueName: \"kubernetes.io/projected/15f3da25-9cb6-406e-b022-935c6201ea4a-kube-api-access-pmplj\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.357711 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f3da25-9cb6-406e-b022-935c6201ea4a-internal-tls-certs\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " 
pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.356664 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15f3da25-9cb6-406e-b022-935c6201ea4a-run-httpd\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.359367 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15f3da25-9cb6-406e-b022-935c6201ea4a-log-httpd\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.362083 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f3da25-9cb6-406e-b022-935c6201ea4a-public-tls-certs\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.363991 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f3da25-9cb6-406e-b022-935c6201ea4a-internal-tls-certs\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.364611 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f3da25-9cb6-406e-b022-935c6201ea4a-config-data\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.364699 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15f3da25-9cb6-406e-b022-935c6201ea4a-etc-swift\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.370280 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a040578-d21e-448d-b653-15547567f335" path="/var/lib/kubelet/pods/1a040578-d21e-448d-b653-15547567f335/volumes" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.371265 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62ce6699-2bf8-4133-ae72-6d91903df144" path="/var/lib/kubelet/pods/62ce6699-2bf8-4133-ae72-6d91903df144/volumes" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.371966 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e34c59d0-d2cd-41cd-990a-bed8a44d1230" path="/var/lib/kubelet/pods/e34c59d0-d2cd-41cd-990a-bed8a44d1230/volumes" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.377694 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f3da25-9cb6-406e-b022-935c6201ea4a-combined-ca-bundle\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.380716 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmplj\" (UniqueName: \"kubernetes.io/projected/15f3da25-9cb6-406e-b022-935c6201ea4a-kube-api-access-pmplj\") pod \"swift-proxy-78c6fdf4b7-xxfgx\" (UID: \"15f3da25-9cb6-406e-b022-935c6201ea4a\") " pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:21 crc kubenswrapper[4775]: I1216 15:16:21.582533 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:22 crc kubenswrapper[4775]: I1216 15:16:22.002254 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc19d8c3-9cbc-45db-ad19-ab8a38792218","Type":"ContainerStarted","Data":"ee4ad1734ebe7b123f1eabd7a4a044c09b66614ff070b880f384c0cfe15e33dd"} Dec 16 15:16:22 crc kubenswrapper[4775]: I1216 15:16:22.002584 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc19d8c3-9cbc-45db-ad19-ab8a38792218","Type":"ContainerStarted","Data":"07c7e9342207039bbf1fd422463546aa70a40d8d335ef9c5c0de2826cde64eb4"} Dec 16 15:16:22 crc kubenswrapper[4775]: I1216 15:16:22.189339 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-78c6fdf4b7-xxfgx"] Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.055387 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fc19d8c3-9cbc-45db-ad19-ab8a38792218","Type":"ContainerStarted","Data":"8d5c798399728e654cec1d913b9c53356a93e29fa3ba30c477c801aa72112043"} Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.061503 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" event={"ID":"15f3da25-9cb6-406e-b022-935c6201ea4a","Type":"ContainerStarted","Data":"c728a46e827c63e12dadb696b66895059fcb2f962ad8e7d018fc48cd937d0c62"} Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.061566 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" event={"ID":"15f3da25-9cb6-406e-b022-935c6201ea4a","Type":"ContainerStarted","Data":"490a4d56c7973e551539de5745fc1ca2698af8753092bf3403c8ed293ba809dd"} Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.061581 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" 
event={"ID":"15f3da25-9cb6-406e-b022-935c6201ea4a","Type":"ContainerStarted","Data":"b753d8c3ca4e9c67ef2e14cef0f3ff2efacb0d8c1778c1cc730f05a8de3e4e3e"} Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.062866 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.063298 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.096016 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.095993225 podStartE2EDuration="3.095993225s" podCreationTimestamp="2025-12-16 15:16:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:23.08413926 +0000 UTC m=+1308.035218213" watchObservedRunningTime="2025-12-16 15:16:23.095993225 +0000 UTC m=+1308.047072148" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.121158 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" podStartSLOduration=2.121138678 podStartE2EDuration="2.121138678s" podCreationTimestamp="2025-12-16 15:16:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:23.106591779 +0000 UTC m=+1308.057670712" watchObservedRunningTime="2025-12-16 15:16:23.121138678 +0000 UTC m=+1308.072217601" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.637758 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-c49bc9464-wb445"] Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.639433 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.643228 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-fspqq" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.643516 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.643824 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.660478 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-c49bc9464-wb445"] Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.719068 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clphq\" (UniqueName: \"kubernetes.io/projected/eb13cacc-e521-4220-a731-18136d35425c-kube-api-access-clphq\") pod \"heat-engine-c49bc9464-wb445\" (UID: \"eb13cacc-e521-4220-a731-18136d35425c\") " pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.719435 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-config-data-custom\") pod \"heat-engine-c49bc9464-wb445\" (UID: \"eb13cacc-e521-4220-a731-18136d35425c\") " pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.719568 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-combined-ca-bundle\") pod \"heat-engine-c49bc9464-wb445\" (UID: \"eb13cacc-e521-4220-a731-18136d35425c\") " pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:16:23 crc 
kubenswrapper[4775]: I1216 15:16:23.719666 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-config-data\") pod \"heat-engine-c49bc9464-wb445\" (UID: \"eb13cacc-e521-4220-a731-18136d35425c\") " pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.766840 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-65646f4f55-j22ds"] Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.768517 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-65646f4f55-j22ds" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.771477 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.792244 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-p8zvh"] Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.794334 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.821383 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-config-data\") pod \"heat-engine-c49bc9464-wb445\" (UID: \"eb13cacc-e521-4220-a731-18136d35425c\") " pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.821677 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.821812 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.821928 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.822036 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghbsx\" (UniqueName: \"kubernetes.io/projected/799c9224-8212-452d-83c5-238ad4a6ed31-kube-api-access-ghbsx\") pod 
\"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.822137 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vwnd\" (UniqueName: \"kubernetes.io/projected/76beac71-bf66-45ec-8a1f-5f6ed8122888-kube-api-access-7vwnd\") pod \"heat-cfnapi-65646f4f55-j22ds\" (UID: \"76beac71-bf66-45ec-8a1f-5f6ed8122888\") " pod="openstack/heat-cfnapi-65646f4f55-j22ds" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.822236 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-combined-ca-bundle\") pod \"heat-cfnapi-65646f4f55-j22ds\" (UID: \"76beac71-bf66-45ec-8a1f-5f6ed8122888\") " pod="openstack/heat-cfnapi-65646f4f55-j22ds" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.822350 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-config\") pod \"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.822440 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-config-data\") pod \"heat-cfnapi-65646f4f55-j22ds\" (UID: \"76beac71-bf66-45ec-8a1f-5f6ed8122888\") " pod="openstack/heat-cfnapi-65646f4f55-j22ds" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.822530 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clphq\" (UniqueName: 
\"kubernetes.io/projected/eb13cacc-e521-4220-a731-18136d35425c-kube-api-access-clphq\") pod \"heat-engine-c49bc9464-wb445\" (UID: \"eb13cacc-e521-4220-a731-18136d35425c\") " pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.822614 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-config-data-custom\") pod \"heat-cfnapi-65646f4f55-j22ds\" (UID: \"76beac71-bf66-45ec-8a1f-5f6ed8122888\") " pod="openstack/heat-cfnapi-65646f4f55-j22ds" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.822720 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-config-data-custom\") pod \"heat-engine-c49bc9464-wb445\" (UID: \"eb13cacc-e521-4220-a731-18136d35425c\") " pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.822817 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.822918 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-combined-ca-bundle\") pod \"heat-engine-c49bc9464-wb445\" (UID: \"eb13cacc-e521-4220-a731-18136d35425c\") " pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.831233 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-65646f4f55-j22ds"] Dec 16 15:16:23 crc 
kubenswrapper[4775]: I1216 15:16:23.839391 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-config-data\") pod \"heat-engine-c49bc9464-wb445\" (UID: \"eb13cacc-e521-4220-a731-18136d35425c\") " pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.849790 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-config-data-custom\") pod \"heat-engine-c49bc9464-wb445\" (UID: \"eb13cacc-e521-4220-a731-18136d35425c\") " pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.859731 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-combined-ca-bundle\") pod \"heat-engine-c49bc9464-wb445\" (UID: \"eb13cacc-e521-4220-a731-18136d35425c\") " pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.876197 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-p8zvh"] Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.882609 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clphq\" (UniqueName: \"kubernetes.io/projected/eb13cacc-e521-4220-a731-18136d35425c-kube-api-access-clphq\") pod \"heat-engine-c49bc9464-wb445\" (UID: \"eb13cacc-e521-4220-a731-18136d35425c\") " pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.918293 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-78c456ddf7-dbpj2"] Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.925069 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-78c456ddf7-dbpj2" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.927318 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.927361 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.927407 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghbsx\" (UniqueName: \"kubernetes.io/projected/799c9224-8212-452d-83c5-238ad4a6ed31-kube-api-access-ghbsx\") pod \"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.927444 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vwnd\" (UniqueName: \"kubernetes.io/projected/76beac71-bf66-45ec-8a1f-5f6ed8122888-kube-api-access-7vwnd\") pod \"heat-cfnapi-65646f4f55-j22ds\" (UID: \"76beac71-bf66-45ec-8a1f-5f6ed8122888\") " pod="openstack/heat-cfnapi-65646f4f55-j22ds" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.927470 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-combined-ca-bundle\") pod \"heat-cfnapi-65646f4f55-j22ds\" (UID: 
\"76beac71-bf66-45ec-8a1f-5f6ed8122888\") " pod="openstack/heat-cfnapi-65646f4f55-j22ds" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.927520 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-config\") pod \"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.927550 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-config-data\") pod \"heat-cfnapi-65646f4f55-j22ds\" (UID: \"76beac71-bf66-45ec-8a1f-5f6ed8122888\") " pod="openstack/heat-cfnapi-65646f4f55-j22ds" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.927581 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-config-data-custom\") pod \"heat-cfnapi-65646f4f55-j22ds\" (UID: \"76beac71-bf66-45ec-8a1f-5f6ed8122888\") " pod="openstack/heat-cfnapi-65646f4f55-j22ds" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.927630 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.927683 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" 
Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.934744 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.936066 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-config\") pod \"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.938794 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.939192 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.939490 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.941566 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: 
\"799c9224-8212-452d-83c5-238ad4a6ed31\") " pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.945664 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-config-data-custom\") pod \"heat-cfnapi-65646f4f55-j22ds\" (UID: \"76beac71-bf66-45ec-8a1f-5f6ed8122888\") " pod="openstack/heat-cfnapi-65646f4f55-j22ds" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.956868 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-combined-ca-bundle\") pod \"heat-cfnapi-65646f4f55-j22ds\" (UID: \"76beac71-bf66-45ec-8a1f-5f6ed8122888\") " pod="openstack/heat-cfnapi-65646f4f55-j22ds" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.965369 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.966950 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vwnd\" (UniqueName: \"kubernetes.io/projected/76beac71-bf66-45ec-8a1f-5f6ed8122888-kube-api-access-7vwnd\") pod \"heat-cfnapi-65646f4f55-j22ds\" (UID: \"76beac71-bf66-45ec-8a1f-5f6ed8122888\") " pod="openstack/heat-cfnapi-65646f4f55-j22ds" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.973075 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-config-data\") pod \"heat-cfnapi-65646f4f55-j22ds\" (UID: \"76beac71-bf66-45ec-8a1f-5f6ed8122888\") " pod="openstack/heat-cfnapi-65646f4f55-j22ds" Dec 16 15:16:23 crc kubenswrapper[4775]: I1216 15:16:23.973789 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-78c456ddf7-dbpj2"] Dec 16 15:16:24 crc 
kubenswrapper[4775]: I1216 15:16:24.032531 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-config-data\") pod \"heat-api-78c456ddf7-dbpj2\" (UID: \"e42b8d55-b8c7-4982-9cef-714706199d4a\") " pod="openstack/heat-api-78c456ddf7-dbpj2" Dec 16 15:16:24 crc kubenswrapper[4775]: I1216 15:16:24.032616 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwj5k\" (UniqueName: \"kubernetes.io/projected/e42b8d55-b8c7-4982-9cef-714706199d4a-kube-api-access-dwj5k\") pod \"heat-api-78c456ddf7-dbpj2\" (UID: \"e42b8d55-b8c7-4982-9cef-714706199d4a\") " pod="openstack/heat-api-78c456ddf7-dbpj2" Dec 16 15:16:24 crc kubenswrapper[4775]: I1216 15:16:24.032727 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-config-data-custom\") pod \"heat-api-78c456ddf7-dbpj2\" (UID: \"e42b8d55-b8c7-4982-9cef-714706199d4a\") " pod="openstack/heat-api-78c456ddf7-dbpj2" Dec 16 15:16:24 crc kubenswrapper[4775]: I1216 15:16:24.032758 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-combined-ca-bundle\") pod \"heat-api-78c456ddf7-dbpj2\" (UID: \"e42b8d55-b8c7-4982-9cef-714706199d4a\") " pod="openstack/heat-api-78c456ddf7-dbpj2" Dec 16 15:16:24 crc kubenswrapper[4775]: I1216 15:16:24.116382 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-65646f4f55-j22ds" Dec 16 15:16:24 crc kubenswrapper[4775]: I1216 15:16:24.170172 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-config-data-custom\") pod \"heat-api-78c456ddf7-dbpj2\" (UID: \"e42b8d55-b8c7-4982-9cef-714706199d4a\") " pod="openstack/heat-api-78c456ddf7-dbpj2" Dec 16 15:16:24 crc kubenswrapper[4775]: I1216 15:16:24.170243 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-combined-ca-bundle\") pod \"heat-api-78c456ddf7-dbpj2\" (UID: \"e42b8d55-b8c7-4982-9cef-714706199d4a\") " pod="openstack/heat-api-78c456ddf7-dbpj2" Dec 16 15:16:24 crc kubenswrapper[4775]: I1216 15:16:24.170357 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-config-data\") pod \"heat-api-78c456ddf7-dbpj2\" (UID: \"e42b8d55-b8c7-4982-9cef-714706199d4a\") " pod="openstack/heat-api-78c456ddf7-dbpj2" Dec 16 15:16:24 crc kubenswrapper[4775]: I1216 15:16:24.170397 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwj5k\" (UniqueName: \"kubernetes.io/projected/e42b8d55-b8c7-4982-9cef-714706199d4a-kube-api-access-dwj5k\") pod \"heat-api-78c456ddf7-dbpj2\" (UID: \"e42b8d55-b8c7-4982-9cef-714706199d4a\") " pod="openstack/heat-api-78c456ddf7-dbpj2" Dec 16 15:16:24 crc kubenswrapper[4775]: I1216 15:16:24.180734 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghbsx\" (UniqueName: \"kubernetes.io/projected/799c9224-8212-452d-83c5-238ad4a6ed31-kube-api-access-ghbsx\") pod \"dnsmasq-dns-7756b9d78c-p8zvh\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " 
pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:24 crc kubenswrapper[4775]: I1216 15:16:24.197002 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-combined-ca-bundle\") pod \"heat-api-78c456ddf7-dbpj2\" (UID: \"e42b8d55-b8c7-4982-9cef-714706199d4a\") " pod="openstack/heat-api-78c456ddf7-dbpj2" Dec 16 15:16:24 crc kubenswrapper[4775]: I1216 15:16:24.197110 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-config-data-custom\") pod \"heat-api-78c456ddf7-dbpj2\" (UID: \"e42b8d55-b8c7-4982-9cef-714706199d4a\") " pod="openstack/heat-api-78c456ddf7-dbpj2" Dec 16 15:16:24 crc kubenswrapper[4775]: I1216 15:16:24.198910 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-config-data\") pod \"heat-api-78c456ddf7-dbpj2\" (UID: \"e42b8d55-b8c7-4982-9cef-714706199d4a\") " pod="openstack/heat-api-78c456ddf7-dbpj2" Dec 16 15:16:24 crc kubenswrapper[4775]: I1216 15:16:24.239832 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwj5k\" (UniqueName: \"kubernetes.io/projected/e42b8d55-b8c7-4982-9cef-714706199d4a-kube-api-access-dwj5k\") pod \"heat-api-78c456ddf7-dbpj2\" (UID: \"e42b8d55-b8c7-4982-9cef-714706199d4a\") " pod="openstack/heat-api-78c456ddf7-dbpj2" Dec 16 15:16:24 crc kubenswrapper[4775]: I1216 15:16:24.260411 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:24 crc kubenswrapper[4775]: I1216 15:16:24.459028 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-78c456ddf7-dbpj2" Dec 16 15:16:24 crc kubenswrapper[4775]: I1216 15:16:24.954257 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-c49bc9464-wb445"] Dec 16 15:16:24 crc kubenswrapper[4775]: I1216 15:16:24.976994 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-p8zvh"] Dec 16 15:16:25 crc kubenswrapper[4775]: I1216 15:16:25.125052 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-c49bc9464-wb445" event={"ID":"eb13cacc-e521-4220-a731-18136d35425c","Type":"ContainerStarted","Data":"cd9c7e198c4dc902113bd14ecc7fdaca2b7993e14db35e20e96a667ba8f072d9"} Dec 16 15:16:25 crc kubenswrapper[4775]: I1216 15:16:25.163161 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" event={"ID":"799c9224-8212-452d-83c5-238ad4a6ed31","Type":"ContainerStarted","Data":"baee6b31a9b4d609a7154aded2bf57012de69a50776c41b81d31212166be1423"} Dec 16 15:16:25 crc kubenswrapper[4775]: I1216 15:16:25.238260 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-65646f4f55-j22ds"] Dec 16 15:16:25 crc kubenswrapper[4775]: I1216 15:16:25.290161 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-78c456ddf7-dbpj2"] Dec 16 15:16:25 crc kubenswrapper[4775]: I1216 15:16:25.479766 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:25 crc kubenswrapper[4775]: I1216 15:16:25.480106 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b12d393-83d8-4db9-8aee-517105ff8484" containerName="ceilometer-central-agent" containerID="cri-o://6b7187048db9b4d234ae68b44be6c47b31e1b8bd37d0847e216348740a8d3729" gracePeriod=30 Dec 16 15:16:25 crc kubenswrapper[4775]: I1216 15:16:25.480584 4775 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="4b12d393-83d8-4db9-8aee-517105ff8484" containerName="proxy-httpd" containerID="cri-o://79283e9e5c95601260fa349de4b8002d22b523e76312c76682c46ac799e6d7de" gracePeriod=30 Dec 16 15:16:25 crc kubenswrapper[4775]: I1216 15:16:25.480669 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b12d393-83d8-4db9-8aee-517105ff8484" containerName="sg-core" containerID="cri-o://b17246be99afe61b30e278dcf66d4e0e0e4995fd2071d7a46be4d8fba61e0353" gracePeriod=30 Dec 16 15:16:25 crc kubenswrapper[4775]: I1216 15:16:25.480735 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b12d393-83d8-4db9-8aee-517105ff8484" containerName="ceilometer-notification-agent" containerID="cri-o://9b3e98d922ff0030997f7c093c8c5083a66dad7204d5b26869dbc5e97f4249e8" gracePeriod=30 Dec 16 15:16:25 crc kubenswrapper[4775]: I1216 15:16:25.684314 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 16 15:16:26 crc kubenswrapper[4775]: I1216 15:16:26.184209 4775 generic.go:334] "Generic (PLEG): container finished" podID="4b12d393-83d8-4db9-8aee-517105ff8484" containerID="79283e9e5c95601260fa349de4b8002d22b523e76312c76682c46ac799e6d7de" exitCode=0 Dec 16 15:16:26 crc kubenswrapper[4775]: I1216 15:16:26.184606 4775 generic.go:334] "Generic (PLEG): container finished" podID="4b12d393-83d8-4db9-8aee-517105ff8484" containerID="b17246be99afe61b30e278dcf66d4e0e0e4995fd2071d7a46be4d8fba61e0353" exitCode=2 Dec 16 15:16:26 crc kubenswrapper[4775]: I1216 15:16:26.184301 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b12d393-83d8-4db9-8aee-517105ff8484","Type":"ContainerDied","Data":"79283e9e5c95601260fa349de4b8002d22b523e76312c76682c46ac799e6d7de"} Dec 16 15:16:26 crc kubenswrapper[4775]: I1216 15:16:26.184681 4775 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"4b12d393-83d8-4db9-8aee-517105ff8484","Type":"ContainerDied","Data":"b17246be99afe61b30e278dcf66d4e0e0e4995fd2071d7a46be4d8fba61e0353"} Dec 16 15:16:26 crc kubenswrapper[4775]: I1216 15:16:26.184697 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b12d393-83d8-4db9-8aee-517105ff8484","Type":"ContainerDied","Data":"6b7187048db9b4d234ae68b44be6c47b31e1b8bd37d0847e216348740a8d3729"} Dec 16 15:16:26 crc kubenswrapper[4775]: I1216 15:16:26.184622 4775 generic.go:334] "Generic (PLEG): container finished" podID="4b12d393-83d8-4db9-8aee-517105ff8484" containerID="6b7187048db9b4d234ae68b44be6c47b31e1b8bd37d0847e216348740a8d3729" exitCode=0 Dec 16 15:16:29 crc kubenswrapper[4775]: I1216 15:16:29.218204 4775 generic.go:334] "Generic (PLEG): container finished" podID="4b12d393-83d8-4db9-8aee-517105ff8484" containerID="9b3e98d922ff0030997f7c093c8c5083a66dad7204d5b26869dbc5e97f4249e8" exitCode=0 Dec 16 15:16:29 crc kubenswrapper[4775]: I1216 15:16:29.218253 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b12d393-83d8-4db9-8aee-517105ff8484","Type":"ContainerDied","Data":"9b3e98d922ff0030997f7c093c8c5083a66dad7204d5b26869dbc5e97f4249e8"} Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.598297 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.598631 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0a2a6a81-f246-4963-bfcc-40d974860cd4" containerName="glance-log" containerID="cri-o://fe97ccdd2b524fba889177fa7b4be60d1aff4252728e422879dd3ed648a7e03f" gracePeriod=30 Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.598736 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="0a2a6a81-f246-4963-bfcc-40d974860cd4" containerName="glance-httpd" containerID="cri-o://e903035b3104b3f236a6ba889058883c4f7a80ef2bdebff8c661b0b0d43be75c" gracePeriod=30 Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.743731 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-59fcc7f56d-krpcl"] Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.744929 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-59fcc7f56d-krpcl" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.778919 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-59fcc7f56d-krpcl"] Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.814185 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-76487b4cc4-bpbgr"] Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.815729 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.824428 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlsm8\" (UniqueName: \"kubernetes.io/projected/262d5cc2-3677-4f62-aa93-60ccab4cf899-kube-api-access-rlsm8\") pod \"heat-engine-59fcc7f56d-krpcl\" (UID: \"262d5cc2-3677-4f62-aa93-60ccab4cf899\") " pod="openstack/heat-engine-59fcc7f56d-krpcl" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.824516 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/262d5cc2-3677-4f62-aa93-60ccab4cf899-config-data\") pod \"heat-engine-59fcc7f56d-krpcl\" (UID: \"262d5cc2-3677-4f62-aa93-60ccab4cf899\") " pod="openstack/heat-engine-59fcc7f56d-krpcl" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.824594 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/262d5cc2-3677-4f62-aa93-60ccab4cf899-combined-ca-bundle\") pod \"heat-engine-59fcc7f56d-krpcl\" (UID: \"262d5cc2-3677-4f62-aa93-60ccab4cf899\") " pod="openstack/heat-engine-59fcc7f56d-krpcl" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.824621 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/262d5cc2-3677-4f62-aa93-60ccab4cf899-config-data-custom\") pod \"heat-engine-59fcc7f56d-krpcl\" (UID: \"262d5cc2-3677-4f62-aa93-60ccab4cf899\") " pod="openstack/heat-engine-59fcc7f56d-krpcl" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.831974 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-59c7c5dfbf-8495r"] Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.834221 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.847997 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-76487b4cc4-bpbgr"] Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.872200 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-59c7c5dfbf-8495r"] Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.926016 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlsm8\" (UniqueName: \"kubernetes.io/projected/262d5cc2-3677-4f62-aa93-60ccab4cf899-kube-api-access-rlsm8\") pod \"heat-engine-59fcc7f56d-krpcl\" (UID: \"262d5cc2-3677-4f62-aa93-60ccab4cf899\") " pod="openstack/heat-engine-59fcc7f56d-krpcl" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.926086 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-config-data-custom\") pod \"heat-api-76487b4cc4-bpbgr\" (UID: \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\") " pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.926117 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cntm2\" (UniqueName: \"kubernetes.io/projected/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-kube-api-access-cntm2\") pod \"heat-api-76487b4cc4-bpbgr\" (UID: \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\") " pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.926151 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/262d5cc2-3677-4f62-aa93-60ccab4cf899-config-data\") pod \"heat-engine-59fcc7f56d-krpcl\" (UID: \"262d5cc2-3677-4f62-aa93-60ccab4cf899\") " pod="openstack/heat-engine-59fcc7f56d-krpcl" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.926169 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-combined-ca-bundle\") pod \"heat-api-76487b4cc4-bpbgr\" (UID: \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\") " pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.926197 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-combined-ca-bundle\") pod \"heat-cfnapi-59c7c5dfbf-8495r\" (UID: \"402ffd59-e84f-4e09-9d8d-d89c6c788547\") " pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.926222 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-njxrq\" (UniqueName: \"kubernetes.io/projected/402ffd59-e84f-4e09-9d8d-d89c6c788547-kube-api-access-njxrq\") pod \"heat-cfnapi-59c7c5dfbf-8495r\" (UID: \"402ffd59-e84f-4e09-9d8d-d89c6c788547\") " pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.926257 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-config-data\") pod \"heat-api-76487b4cc4-bpbgr\" (UID: \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\") " pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.926284 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-config-data-custom\") pod \"heat-cfnapi-59c7c5dfbf-8495r\" (UID: \"402ffd59-e84f-4e09-9d8d-d89c6c788547\") " pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.926303 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-config-data\") pod \"heat-cfnapi-59c7c5dfbf-8495r\" (UID: \"402ffd59-e84f-4e09-9d8d-d89c6c788547\") " pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.926336 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/262d5cc2-3677-4f62-aa93-60ccab4cf899-combined-ca-bundle\") pod \"heat-engine-59fcc7f56d-krpcl\" (UID: \"262d5cc2-3677-4f62-aa93-60ccab4cf899\") " pod="openstack/heat-engine-59fcc7f56d-krpcl" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.926368 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/262d5cc2-3677-4f62-aa93-60ccab4cf899-config-data-custom\") pod \"heat-engine-59fcc7f56d-krpcl\" (UID: \"262d5cc2-3677-4f62-aa93-60ccab4cf899\") " pod="openstack/heat-engine-59fcc7f56d-krpcl" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.934480 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/262d5cc2-3677-4f62-aa93-60ccab4cf899-config-data\") pod \"heat-engine-59fcc7f56d-krpcl\" (UID: \"262d5cc2-3677-4f62-aa93-60ccab4cf899\") " pod="openstack/heat-engine-59fcc7f56d-krpcl" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.935252 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/262d5cc2-3677-4f62-aa93-60ccab4cf899-config-data-custom\") pod \"heat-engine-59fcc7f56d-krpcl\" (UID: \"262d5cc2-3677-4f62-aa93-60ccab4cf899\") " pod="openstack/heat-engine-59fcc7f56d-krpcl" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.935697 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/262d5cc2-3677-4f62-aa93-60ccab4cf899-combined-ca-bundle\") pod \"heat-engine-59fcc7f56d-krpcl\" (UID: \"262d5cc2-3677-4f62-aa93-60ccab4cf899\") " pod="openstack/heat-engine-59fcc7f56d-krpcl" Dec 16 15:16:30 crc kubenswrapper[4775]: I1216 15:16:30.949130 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlsm8\" (UniqueName: \"kubernetes.io/projected/262d5cc2-3677-4f62-aa93-60ccab4cf899-kube-api-access-rlsm8\") pod \"heat-engine-59fcc7f56d-krpcl\" (UID: \"262d5cc2-3677-4f62-aa93-60ccab4cf899\") " pod="openstack/heat-engine-59fcc7f56d-krpcl" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.028323 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njxrq\" (UniqueName: 
\"kubernetes.io/projected/402ffd59-e84f-4e09-9d8d-d89c6c788547-kube-api-access-njxrq\") pod \"heat-cfnapi-59c7c5dfbf-8495r\" (UID: \"402ffd59-e84f-4e09-9d8d-d89c6c788547\") " pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.028403 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-config-data\") pod \"heat-api-76487b4cc4-bpbgr\" (UID: \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\") " pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.028444 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-config-data-custom\") pod \"heat-cfnapi-59c7c5dfbf-8495r\" (UID: \"402ffd59-e84f-4e09-9d8d-d89c6c788547\") " pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.028464 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-config-data\") pod \"heat-cfnapi-59c7c5dfbf-8495r\" (UID: \"402ffd59-e84f-4e09-9d8d-d89c6c788547\") " pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.028544 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-config-data-custom\") pod \"heat-api-76487b4cc4-bpbgr\" (UID: \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\") " pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.028568 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cntm2\" (UniqueName: 
\"kubernetes.io/projected/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-kube-api-access-cntm2\") pod \"heat-api-76487b4cc4-bpbgr\" (UID: \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\") " pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.028598 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-combined-ca-bundle\") pod \"heat-api-76487b4cc4-bpbgr\" (UID: \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\") " pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.028626 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-combined-ca-bundle\") pod \"heat-cfnapi-59c7c5dfbf-8495r\" (UID: \"402ffd59-e84f-4e09-9d8d-d89c6c788547\") " pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.039802 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-config-data-custom\") pod \"heat-cfnapi-59c7c5dfbf-8495r\" (UID: \"402ffd59-e84f-4e09-9d8d-d89c6c788547\") " pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.048879 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-combined-ca-bundle\") pod \"heat-cfnapi-59c7c5dfbf-8495r\" (UID: \"402ffd59-e84f-4e09-9d8d-d89c6c788547\") " pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.055432 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-config-data\") pod \"heat-cfnapi-59c7c5dfbf-8495r\" (UID: \"402ffd59-e84f-4e09-9d8d-d89c6c788547\") " pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.056290 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-config-data\") pod \"heat-api-76487b4cc4-bpbgr\" (UID: \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\") " pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.056586 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-combined-ca-bundle\") pod \"heat-api-76487b4cc4-bpbgr\" (UID: \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\") " pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.061769 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-config-data-custom\") pod \"heat-api-76487b4cc4-bpbgr\" (UID: \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\") " pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.061779 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cntm2\" (UniqueName: \"kubernetes.io/projected/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-kube-api-access-cntm2\") pod \"heat-api-76487b4cc4-bpbgr\" (UID: \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\") " pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.062510 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njxrq\" (UniqueName: \"kubernetes.io/projected/402ffd59-e84f-4e09-9d8d-d89c6c788547-kube-api-access-njxrq\") pod 
\"heat-cfnapi-59c7c5dfbf-8495r\" (UID: \"402ffd59-e84f-4e09-9d8d-d89c6c788547\") " pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.093490 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-59fcc7f56d-krpcl" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.148678 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.170517 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.250214 4775 generic.go:334] "Generic (PLEG): container finished" podID="0a2a6a81-f246-4963-bfcc-40d974860cd4" containerID="fe97ccdd2b524fba889177fa7b4be60d1aff4252728e422879dd3ed648a7e03f" exitCode=143 Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.250261 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0a2a6a81-f246-4963-bfcc-40d974860cd4","Type":"ContainerDied","Data":"fe97ccdd2b524fba889177fa7b4be60d1aff4252728e422879dd3ed648a7e03f"} Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.279509 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.594754 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.595140 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-78c6fdf4b7-xxfgx" Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.932129 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 
15:16:31.932620 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cae742d1-aa2b-4462-a2ca-e3b73d58d564" containerName="glance-log" containerID="cri-o://2ba717144947e6116cbb8342108d99471e1beb3f36cdefc8beba61b3c7ecf227" gracePeriod=30 Dec 16 15:16:31 crc kubenswrapper[4775]: I1216 15:16:31.933103 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cae742d1-aa2b-4462-a2ca-e3b73d58d564" containerName="glance-httpd" containerID="cri-o://298cb301eb8afbe6b4e5205af068322e1c3698f9b518c92ec1229c8568754020" gracePeriod=30 Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.141564 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-svtt9"] Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.144039 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-svtt9" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.216872 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-svtt9"] Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.245838 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-68dw8"] Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.247742 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-68dw8" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.259378 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-9435-account-create-update-td6k7"] Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.261307 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-9435-account-create-update-td6k7" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.264586 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-68dw8"] Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.273724 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.275874 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9435-account-create-update-td6k7"] Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.290428 4775 generic.go:334] "Generic (PLEG): container finished" podID="cae742d1-aa2b-4462-a2ca-e3b73d58d564" containerID="2ba717144947e6116cbb8342108d99471e1beb3f36cdefc8beba61b3c7ecf227" exitCode=143 Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.290745 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cae742d1-aa2b-4462-a2ca-e3b73d58d564","Type":"ContainerDied","Data":"2ba717144947e6116cbb8342108d99471e1beb3f36cdefc8beba61b3c7ecf227"} Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.297030 4775 generic.go:334] "Generic (PLEG): container finished" podID="799c9224-8212-452d-83c5-238ad4a6ed31" containerID="90d5598e6ff11e8b0a2b85ff321a12c25134c57df5752ad720223a7e460952fd" exitCode=0 Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.297118 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" event={"ID":"799c9224-8212-452d-83c5-238ad4a6ed31","Type":"ContainerDied","Data":"90d5598e6ff11e8b0a2b85ff321a12c25134c57df5752ad720223a7e460952fd"} Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.304356 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2gmj\" (UniqueName: 
\"kubernetes.io/projected/27246978-4a11-4370-9635-71ef44e99b6c-kube-api-access-v2gmj\") pod \"nova-api-db-create-svtt9\" (UID: \"27246978-4a11-4370-9635-71ef44e99b6c\") " pod="openstack/nova-api-db-create-svtt9" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.304758 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27246978-4a11-4370-9635-71ef44e99b6c-operator-scripts\") pod \"nova-api-db-create-svtt9\" (UID: \"27246978-4a11-4370-9635-71ef44e99b6c\") " pod="openstack/nova-api-db-create-svtt9" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.308108 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-78c456ddf7-dbpj2" event={"ID":"e42b8d55-b8c7-4982-9cef-714706199d4a","Type":"ContainerStarted","Data":"c2c049ee0fafef25606f61695fc4588b5ca2c4e8c6a25b19b3dc7ee546c2a37d"} Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.311519 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-65646f4f55-j22ds" event={"ID":"76beac71-bf66-45ec-8a1f-5f6ed8122888","Type":"ContainerStarted","Data":"fcf168b3bd8082fdb2c3280cddc8f76ef170ba21bfb6fbece69fe031662c4f40"} Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.313842 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-c49bc9464-wb445" event={"ID":"eb13cacc-e521-4220-a731-18136d35425c","Type":"ContainerStarted","Data":"8906e4b9e85b5b63b8682c82f8bb1a39eecc76a007eefd56a8310c2c1fc594a8"} Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.324378 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.413169 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.1452523709999998 podStartE2EDuration="16.413145936s" 
podCreationTimestamp="2025-12-16 15:16:16 +0000 UTC" firstStartedPulling="2025-12-16 15:16:17.526399276 +0000 UTC m=+1302.477478199" lastFinishedPulling="2025-12-16 15:16:31.794292841 +0000 UTC m=+1316.745371764" observedRunningTime="2025-12-16 15:16:32.342365913 +0000 UTC m=+1317.293444836" watchObservedRunningTime="2025-12-16 15:16:32.413145936 +0000 UTC m=+1317.364224859" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.414866 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq74f\" (UniqueName: \"kubernetes.io/projected/c18fb7c5-60d7-497d-8031-4d3c073104a6-kube-api-access-fq74f\") pod \"nova-api-9435-account-create-update-td6k7\" (UID: \"c18fb7c5-60d7-497d-8031-4d3c073104a6\") " pod="openstack/nova-api-9435-account-create-update-td6k7" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.414987 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5bxr\" (UniqueName: \"kubernetes.io/projected/454beaa2-a30a-4b5f-bb64-95eafaa20360-kube-api-access-f5bxr\") pod \"nova-cell0-db-create-68dw8\" (UID: \"454beaa2-a30a-4b5f-bb64-95eafaa20360\") " pod="openstack/nova-cell0-db-create-68dw8" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.415079 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c18fb7c5-60d7-497d-8031-4d3c073104a6-operator-scripts\") pod \"nova-api-9435-account-create-update-td6k7\" (UID: \"c18fb7c5-60d7-497d-8031-4d3c073104a6\") " pod="openstack/nova-api-9435-account-create-update-td6k7" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.415159 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27246978-4a11-4370-9635-71ef44e99b6c-operator-scripts\") pod \"nova-api-db-create-svtt9\" (UID: 
\"27246978-4a11-4370-9635-71ef44e99b6c\") " pod="openstack/nova-api-db-create-svtt9" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.415231 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2gmj\" (UniqueName: \"kubernetes.io/projected/27246978-4a11-4370-9635-71ef44e99b6c-kube-api-access-v2gmj\") pod \"nova-api-db-create-svtt9\" (UID: \"27246978-4a11-4370-9635-71ef44e99b6c\") " pod="openstack/nova-api-db-create-svtt9" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.415489 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/454beaa2-a30a-4b5f-bb64-95eafaa20360-operator-scripts\") pod \"nova-cell0-db-create-68dw8\" (UID: \"454beaa2-a30a-4b5f-bb64-95eafaa20360\") " pod="openstack/nova-cell0-db-create-68dw8" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.416428 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27246978-4a11-4370-9635-71ef44e99b6c-operator-scripts\") pod \"nova-api-db-create-svtt9\" (UID: \"27246978-4a11-4370-9635-71ef44e99b6c\") " pod="openstack/nova-api-db-create-svtt9" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.438620 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-c49bc9464-wb445" podStartSLOduration=9.438592739 podStartE2EDuration="9.438592739s" podCreationTimestamp="2025-12-16 15:16:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:32.361011481 +0000 UTC m=+1317.312090424" watchObservedRunningTime="2025-12-16 15:16:32.438592739 +0000 UTC m=+1317.389671662" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.439176 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2gmj\" 
(UniqueName: \"kubernetes.io/projected/27246978-4a11-4370-9635-71ef44e99b6c-kube-api-access-v2gmj\") pod \"nova-api-db-create-svtt9\" (UID: \"27246978-4a11-4370-9635-71ef44e99b6c\") " pod="openstack/nova-api-db-create-svtt9" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.460279 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-8vlhw"] Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.464820 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8vlhw" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.483341 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-52a3-account-create-update-zq6jv"] Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.484667 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-52a3-account-create-update-zq6jv" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.490382 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.492734 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.497828 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-svtt9" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.504440 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8vlhw"] Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.513717 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-52a3-account-create-update-zq6jv"] Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.517587 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c18fb7c5-60d7-497d-8031-4d3c073104a6-operator-scripts\") pod \"nova-api-9435-account-create-update-td6k7\" (UID: \"c18fb7c5-60d7-497d-8031-4d3c073104a6\") " pod="openstack/nova-api-9435-account-create-update-td6k7" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.518143 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/454beaa2-a30a-4b5f-bb64-95eafaa20360-operator-scripts\") pod \"nova-cell0-db-create-68dw8\" (UID: \"454beaa2-a30a-4b5f-bb64-95eafaa20360\") " pod="openstack/nova-cell0-db-create-68dw8" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.518249 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq74f\" (UniqueName: \"kubernetes.io/projected/c18fb7c5-60d7-497d-8031-4d3c073104a6-kube-api-access-fq74f\") pod \"nova-api-9435-account-create-update-td6k7\" (UID: \"c18fb7c5-60d7-497d-8031-4d3c073104a6\") " pod="openstack/nova-api-9435-account-create-update-td6k7" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.518396 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5bxr\" (UniqueName: \"kubernetes.io/projected/454beaa2-a30a-4b5f-bb64-95eafaa20360-kube-api-access-f5bxr\") pod \"nova-cell0-db-create-68dw8\" (UID: \"454beaa2-a30a-4b5f-bb64-95eafaa20360\") " 
pod="openstack/nova-cell0-db-create-68dw8" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.519931 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c18fb7c5-60d7-497d-8031-4d3c073104a6-operator-scripts\") pod \"nova-api-9435-account-create-update-td6k7\" (UID: \"c18fb7c5-60d7-497d-8031-4d3c073104a6\") " pod="openstack/nova-api-9435-account-create-update-td6k7" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.520482 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/454beaa2-a30a-4b5f-bb64-95eafaa20360-operator-scripts\") pod \"nova-cell0-db-create-68dw8\" (UID: \"454beaa2-a30a-4b5f-bb64-95eafaa20360\") " pod="openstack/nova-cell0-db-create-68dw8" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.539734 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-76487b4cc4-bpbgr"] Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.545609 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5bxr\" (UniqueName: \"kubernetes.io/projected/454beaa2-a30a-4b5f-bb64-95eafaa20360-kube-api-access-f5bxr\") pod \"nova-cell0-db-create-68dw8\" (UID: \"454beaa2-a30a-4b5f-bb64-95eafaa20360\") " pod="openstack/nova-cell0-db-create-68dw8" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.546446 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq74f\" (UniqueName: \"kubernetes.io/projected/c18fb7c5-60d7-497d-8031-4d3c073104a6-kube-api-access-fq74f\") pod \"nova-api-9435-account-create-update-td6k7\" (UID: \"c18fb7c5-60d7-497d-8031-4d3c073104a6\") " pod="openstack/nova-api-9435-account-create-update-td6k7" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.618908 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-41a9-account-create-update-fnj2g"] Dec 16 15:16:32 
crc kubenswrapper[4775]: E1216 15:16:32.620027 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b12d393-83d8-4db9-8aee-517105ff8484" containerName="ceilometer-central-agent" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.620075 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b12d393-83d8-4db9-8aee-517105ff8484" containerName="ceilometer-central-agent" Dec 16 15:16:32 crc kubenswrapper[4775]: E1216 15:16:32.620095 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b12d393-83d8-4db9-8aee-517105ff8484" containerName="ceilometer-notification-agent" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.620104 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b12d393-83d8-4db9-8aee-517105ff8484" containerName="ceilometer-notification-agent" Dec 16 15:16:32 crc kubenswrapper[4775]: E1216 15:16:32.620118 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b12d393-83d8-4db9-8aee-517105ff8484" containerName="proxy-httpd" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.620146 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b12d393-83d8-4db9-8aee-517105ff8484" containerName="proxy-httpd" Dec 16 15:16:32 crc kubenswrapper[4775]: E1216 15:16:32.620156 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b12d393-83d8-4db9-8aee-517105ff8484" containerName="sg-core" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.620163 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b12d393-83d8-4db9-8aee-517105ff8484" containerName="sg-core" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.620361 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b12d393-83d8-4db9-8aee-517105ff8484" containerName="ceilometer-notification-agent" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.620380 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b12d393-83d8-4db9-8aee-517105ff8484" 
containerName="ceilometer-central-agent" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.620395 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b12d393-83d8-4db9-8aee-517105ff8484" containerName="sg-core" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.620404 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b12d393-83d8-4db9-8aee-517105ff8484" containerName="proxy-httpd" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.621113 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-41a9-account-create-update-fnj2g" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.622579 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-scripts\") pod \"4b12d393-83d8-4db9-8aee-517105ff8484\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.622768 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b12d393-83d8-4db9-8aee-517105ff8484-log-httpd\") pod \"4b12d393-83d8-4db9-8aee-517105ff8484\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.622841 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-sg-core-conf-yaml\") pod \"4b12d393-83d8-4db9-8aee-517105ff8484\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.622899 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-config-data\") pod \"4b12d393-83d8-4db9-8aee-517105ff8484\" (UID: 
\"4b12d393-83d8-4db9-8aee-517105ff8484\") " Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.622952 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-combined-ca-bundle\") pod \"4b12d393-83d8-4db9-8aee-517105ff8484\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.623036 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b12d393-83d8-4db9-8aee-517105ff8484-run-httpd\") pod \"4b12d393-83d8-4db9-8aee-517105ff8484\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.623202 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smnrg\" (UniqueName: \"kubernetes.io/projected/4b12d393-83d8-4db9-8aee-517105ff8484-kube-api-access-smnrg\") pod \"4b12d393-83d8-4db9-8aee-517105ff8484\" (UID: \"4b12d393-83d8-4db9-8aee-517105ff8484\") " Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.623486 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b12d393-83d8-4db9-8aee-517105ff8484-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4b12d393-83d8-4db9-8aee-517105ff8484" (UID: "4b12d393-83d8-4db9-8aee-517105ff8484"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.623509 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndk5g\" (UniqueName: \"kubernetes.io/projected/382550fb-c9fc-4100-a196-8ab11975d0ad-kube-api-access-ndk5g\") pod \"nova-cell1-db-create-8vlhw\" (UID: \"382550fb-c9fc-4100-a196-8ab11975d0ad\") " pod="openstack/nova-cell1-db-create-8vlhw" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.623607 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mfp2\" (UniqueName: \"kubernetes.io/projected/b5c39639-d2db-4abd-a474-d141a0d0af35-kube-api-access-6mfp2\") pod \"nova-cell0-52a3-account-create-update-zq6jv\" (UID: \"b5c39639-d2db-4abd-a474-d141a0d0af35\") " pod="openstack/nova-cell0-52a3-account-create-update-zq6jv" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.623756 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5c39639-d2db-4abd-a474-d141a0d0af35-operator-scripts\") pod \"nova-cell0-52a3-account-create-update-zq6jv\" (UID: \"b5c39639-d2db-4abd-a474-d141a0d0af35\") " pod="openstack/nova-cell0-52a3-account-create-update-zq6jv" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.623775 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b12d393-83d8-4db9-8aee-517105ff8484-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4b12d393-83d8-4db9-8aee-517105ff8484" (UID: "4b12d393-83d8-4db9-8aee-517105ff8484"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.623824 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/382550fb-c9fc-4100-a196-8ab11975d0ad-operator-scripts\") pod \"nova-cell1-db-create-8vlhw\" (UID: \"382550fb-c9fc-4100-a196-8ab11975d0ad\") " pod="openstack/nova-cell1-db-create-8vlhw" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.623904 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b12d393-83d8-4db9-8aee-517105ff8484-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.623916 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b12d393-83d8-4db9-8aee-517105ff8484-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.628095 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.630466 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b12d393-83d8-4db9-8aee-517105ff8484-kube-api-access-smnrg" (OuterVolumeSpecName: "kube-api-access-smnrg") pod "4b12d393-83d8-4db9-8aee-517105ff8484" (UID: "4b12d393-83d8-4db9-8aee-517105ff8484"). InnerVolumeSpecName "kube-api-access-smnrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.632149 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-scripts" (OuterVolumeSpecName: "scripts") pod "4b12d393-83d8-4db9-8aee-517105ff8484" (UID: "4b12d393-83d8-4db9-8aee-517105ff8484"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.693644 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-41a9-account-create-update-fnj2g"] Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.717124 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4b12d393-83d8-4db9-8aee-517105ff8484" (UID: "4b12d393-83d8-4db9-8aee-517105ff8484"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.730783 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mfp2\" (UniqueName: \"kubernetes.io/projected/b5c39639-d2db-4abd-a474-d141a0d0af35-kube-api-access-6mfp2\") pod \"nova-cell0-52a3-account-create-update-zq6jv\" (UID: \"b5c39639-d2db-4abd-a474-d141a0d0af35\") " pod="openstack/nova-cell0-52a3-account-create-update-zq6jv" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.731012 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx94s\" (UniqueName: \"kubernetes.io/projected/a87cdc28-aa31-4446-a1a6-e0904f9daa62-kube-api-access-gx94s\") pod \"nova-cell1-41a9-account-create-update-fnj2g\" (UID: \"a87cdc28-aa31-4446-a1a6-e0904f9daa62\") " pod="openstack/nova-cell1-41a9-account-create-update-fnj2g" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.731211 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a87cdc28-aa31-4446-a1a6-e0904f9daa62-operator-scripts\") pod \"nova-cell1-41a9-account-create-update-fnj2g\" (UID: \"a87cdc28-aa31-4446-a1a6-e0904f9daa62\") " 
pod="openstack/nova-cell1-41a9-account-create-update-fnj2g" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.731272 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5c39639-d2db-4abd-a474-d141a0d0af35-operator-scripts\") pod \"nova-cell0-52a3-account-create-update-zq6jv\" (UID: \"b5c39639-d2db-4abd-a474-d141a0d0af35\") " pod="openstack/nova-cell0-52a3-account-create-update-zq6jv" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.731332 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/382550fb-c9fc-4100-a196-8ab11975d0ad-operator-scripts\") pod \"nova-cell1-db-create-8vlhw\" (UID: \"382550fb-c9fc-4100-a196-8ab11975d0ad\") " pod="openstack/nova-cell1-db-create-8vlhw" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.731431 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndk5g\" (UniqueName: \"kubernetes.io/projected/382550fb-c9fc-4100-a196-8ab11975d0ad-kube-api-access-ndk5g\") pod \"nova-cell1-db-create-8vlhw\" (UID: \"382550fb-c9fc-4100-a196-8ab11975d0ad\") " pod="openstack/nova-cell1-db-create-8vlhw" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.731632 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.731649 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smnrg\" (UniqueName: \"kubernetes.io/projected/4b12d393-83d8-4db9-8aee-517105ff8484-kube-api-access-smnrg\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.731661 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.731988 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5c39639-d2db-4abd-a474-d141a0d0af35-operator-scripts\") pod \"nova-cell0-52a3-account-create-update-zq6jv\" (UID: \"b5c39639-d2db-4abd-a474-d141a0d0af35\") " pod="openstack/nova-cell0-52a3-account-create-update-zq6jv" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.732526 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/382550fb-c9fc-4100-a196-8ab11975d0ad-operator-scripts\") pod \"nova-cell1-db-create-8vlhw\" (UID: \"382550fb-c9fc-4100-a196-8ab11975d0ad\") " pod="openstack/nova-cell1-db-create-8vlhw" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.747607 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-68dw8" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.750260 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mfp2\" (UniqueName: \"kubernetes.io/projected/b5c39639-d2db-4abd-a474-d141a0d0af35-kube-api-access-6mfp2\") pod \"nova-cell0-52a3-account-create-update-zq6jv\" (UID: \"b5c39639-d2db-4abd-a474-d141a0d0af35\") " pod="openstack/nova-cell0-52a3-account-create-update-zq6jv" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.757409 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndk5g\" (UniqueName: \"kubernetes.io/projected/382550fb-c9fc-4100-a196-8ab11975d0ad-kube-api-access-ndk5g\") pod \"nova-cell1-db-create-8vlhw\" (UID: \"382550fb-c9fc-4100-a196-8ab11975d0ad\") " pod="openstack/nova-cell1-db-create-8vlhw" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.757855 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9435-account-create-update-td6k7" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.764978 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-59c7c5dfbf-8495r"] Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.790492 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b12d393-83d8-4db9-8aee-517105ff8484" (UID: "4b12d393-83d8-4db9-8aee-517105ff8484"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.792077 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8vlhw" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.812942 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-59fcc7f56d-krpcl"] Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.817504 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-52a3-account-create-update-zq6jv" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.821033 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-config-data" (OuterVolumeSpecName: "config-data") pod "4b12d393-83d8-4db9-8aee-517105ff8484" (UID: "4b12d393-83d8-4db9-8aee-517105ff8484"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.834179 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx94s\" (UniqueName: \"kubernetes.io/projected/a87cdc28-aa31-4446-a1a6-e0904f9daa62-kube-api-access-gx94s\") pod \"nova-cell1-41a9-account-create-update-fnj2g\" (UID: \"a87cdc28-aa31-4446-a1a6-e0904f9daa62\") " pod="openstack/nova-cell1-41a9-account-create-update-fnj2g" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.834289 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a87cdc28-aa31-4446-a1a6-e0904f9daa62-operator-scripts\") pod \"nova-cell1-41a9-account-create-update-fnj2g\" (UID: \"a87cdc28-aa31-4446-a1a6-e0904f9daa62\") " pod="openstack/nova-cell1-41a9-account-create-update-fnj2g" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.834397 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:32 crc 
kubenswrapper[4775]: I1216 15:16:32.834411 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b12d393-83d8-4db9-8aee-517105ff8484-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.835131 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a87cdc28-aa31-4446-a1a6-e0904f9daa62-operator-scripts\") pod \"nova-cell1-41a9-account-create-update-fnj2g\" (UID: \"a87cdc28-aa31-4446-a1a6-e0904f9daa62\") " pod="openstack/nova-cell1-41a9-account-create-update-fnj2g" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.854332 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx94s\" (UniqueName: \"kubernetes.io/projected/a87cdc28-aa31-4446-a1a6-e0904f9daa62-kube-api-access-gx94s\") pod \"nova-cell1-41a9-account-create-update-fnj2g\" (UID: \"a87cdc28-aa31-4446-a1a6-e0904f9daa62\") " pod="openstack/nova-cell1-41a9-account-create-update-fnj2g" Dec 16 15:16:32 crc kubenswrapper[4775]: I1216 15:16:32.956717 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-41a9-account-create-update-fnj2g" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.117445 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-svtt9"] Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.366443 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-svtt9" event={"ID":"27246978-4a11-4370-9635-71ef44e99b6c","Type":"ContainerStarted","Data":"26eaad3fb57436593f69c443bd5820ca0b5a45c68a8b4f4d794e7019205fc99a"} Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.366956 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-59fcc7f56d-krpcl" event={"ID":"262d5cc2-3677-4f62-aa93-60ccab4cf899","Type":"ContainerStarted","Data":"69a4642b5ae973e5f046fb76018935d4b1b85f448e9cfb2445cac258d57ffef4"} Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.377335 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" event={"ID":"402ffd59-e84f-4e09-9d8d-d89c6c788547","Type":"ContainerStarted","Data":"d76fa5d418702bdf688b714aa03bb5b5117f367aef89cd62b9e660be65f98183"} Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.393077 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b12d393-83d8-4db9-8aee-517105ff8484","Type":"ContainerDied","Data":"e080563d60cae6b51da7793507eba07acbe648bbee40ae0185865ffc41d0c36f"} Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.393121 4775 scope.go:117] "RemoveContainer" containerID="79283e9e5c95601260fa349de4b8002d22b523e76312c76682c46ac799e6d7de" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.393458 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.398981 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-76487b4cc4-bpbgr" event={"ID":"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d","Type":"ContainerStarted","Data":"48fcc6b7ca8bdf9ef650e2d3c90a224483e8293a1b2d585269f3afe1f0bcb240"} Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.403742 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" event={"ID":"799c9224-8212-452d-83c5-238ad4a6ed31","Type":"ContainerStarted","Data":"712d3f3ed3b67e20fc681276a21bff051e6251dfcf09c0306990f18602c2fa8a"} Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.404729 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.417258 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a9f81b8a-3b7e-4984-946f-2de17873b97a","Type":"ContainerStarted","Data":"da28d0a5660fb944b518dcd5eabe1e1946946b71821283dda5a0a2d99a4db76d"} Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.426450 4775 scope.go:117] "RemoveContainer" containerID="b17246be99afe61b30e278dcf66d4e0e0e4995fd2071d7a46be4d8fba61e0353" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.457377 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.485237 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.488065 4775 scope.go:117] "RemoveContainer" containerID="9b3e98d922ff0030997f7c093c8c5083a66dad7204d5b26869dbc5e97f4249e8" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.510143 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:33 crc kubenswrapper[4775]: 
I1216 15:16:33.513057 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.516216 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" podStartSLOduration=10.516191607 podStartE2EDuration="10.516191607s" podCreationTimestamp="2025-12-16 15:16:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:33.449544254 +0000 UTC m=+1318.400623187" watchObservedRunningTime="2025-12-16 15:16:33.516191607 +0000 UTC m=+1318.467270530" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.516757 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.517310 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.581841 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.600950 4775 scope.go:117] "RemoveContainer" containerID="6b7187048db9b4d234ae68b44be6c47b31e1b8bd37d0847e216348740a8d3729" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.665398 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.665447 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct46f\" (UniqueName: 
\"kubernetes.io/projected/f3399b2a-e8f1-442d-be74-160e5524608b-kube-api-access-ct46f\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.665486 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-config-data\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.665610 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3399b2a-e8f1-442d-be74-160e5524608b-log-httpd\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.665674 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-scripts\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.665693 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3399b2a-e8f1-442d-be74-160e5524608b-run-httpd\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.665803 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " 
pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.698959 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-78c456ddf7-dbpj2"] Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.723271 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-65646f4f55-j22ds"] Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.736968 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-55844f6789-qwjbq"] Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.739086 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.742122 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.742787 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.749075 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-55844f6789-qwjbq"] Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.765807 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-69676fb7c9-tmm27"] Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.767436 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.771367 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-scripts\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.771408 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3399b2a-e8f1-442d-be74-160e5524608b-run-httpd\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.771466 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.771520 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.771544 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct46f\" (UniqueName: \"kubernetes.io/projected/f3399b2a-e8f1-442d-be74-160e5524608b-kube-api-access-ct46f\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.771574 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-config-data\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.771641 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3399b2a-e8f1-442d-be74-160e5524608b-log-httpd\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.772363 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3399b2a-e8f1-442d-be74-160e5524608b-log-httpd\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.773215 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.773520 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.775061 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3399b2a-e8f1-442d-be74-160e5524608b-run-httpd\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.780657 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.812422 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-scripts\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.814599 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-69676fb7c9-tmm27"] Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.817087 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct46f\" (UniqueName: \"kubernetes.io/projected/f3399b2a-e8f1-442d-be74-160e5524608b-kube-api-access-ct46f\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.823691 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-config-data\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.828226 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.841114 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8vlhw"] Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.860524 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.861832 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9435-account-create-update-td6k7"] Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.873040 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bzlc\" (UniqueName: \"kubernetes.io/projected/ea336475-3963-43eb-9e16-814d0c717625-kube-api-access-9bzlc\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.873104 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea336475-3963-43eb-9e16-814d0c717625-config-data\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.873128 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea336475-3963-43eb-9e16-814d0c717625-config-data-custom\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.873177 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9a8d05d-1353-46db-9367-c7205a7d39d9-config-data-custom\") pod \"heat-cfnapi-69676fb7c9-tmm27\" (UID: \"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.873203 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5wfh\" (UniqueName: \"kubernetes.io/projected/d9a8d05d-1353-46db-9367-c7205a7d39d9-kube-api-access-l5wfh\") pod \"heat-cfnapi-69676fb7c9-tmm27\" (UID: \"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.873241 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a8d05d-1353-46db-9367-c7205a7d39d9-config-data\") pod \"heat-cfnapi-69676fb7c9-tmm27\" (UID: \"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.873267 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9a8d05d-1353-46db-9367-c7205a7d39d9-internal-tls-certs\") pod \"heat-cfnapi-69676fb7c9-tmm27\" (UID: \"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.873281 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea336475-3963-43eb-9e16-814d0c717625-combined-ca-bundle\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.873309 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea336475-3963-43eb-9e16-814d0c717625-internal-tls-certs\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:33 crc kubenswrapper[4775]: 
I1216 15:16:33.873332 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9a8d05d-1353-46db-9367-c7205a7d39d9-public-tls-certs\") pod \"heat-cfnapi-69676fb7c9-tmm27\" (UID: \"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.873594 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a8d05d-1353-46db-9367-c7205a7d39d9-combined-ca-bundle\") pod \"heat-cfnapi-69676fb7c9-tmm27\" (UID: \"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.873731 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea336475-3963-43eb-9e16-814d0c717625-public-tls-certs\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.884411 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-68dw8"] Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.910736 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-52a3-account-create-update-zq6jv"] Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.927620 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-41a9-account-create-update-fnj2g"] Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.976538 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5wfh\" (UniqueName: \"kubernetes.io/projected/d9a8d05d-1353-46db-9367-c7205a7d39d9-kube-api-access-l5wfh\") pod 
\"heat-cfnapi-69676fb7c9-tmm27\" (UID: \"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.976583 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a8d05d-1353-46db-9367-c7205a7d39d9-config-data\") pod \"heat-cfnapi-69676fb7c9-tmm27\" (UID: \"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.976616 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9a8d05d-1353-46db-9367-c7205a7d39d9-internal-tls-certs\") pod \"heat-cfnapi-69676fb7c9-tmm27\" (UID: \"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.976646 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea336475-3963-43eb-9e16-814d0c717625-combined-ca-bundle\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.976672 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea336475-3963-43eb-9e16-814d0c717625-internal-tls-certs\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.976695 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9a8d05d-1353-46db-9367-c7205a7d39d9-public-tls-certs\") pod \"heat-cfnapi-69676fb7c9-tmm27\" (UID: 
\"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.976742 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a8d05d-1353-46db-9367-c7205a7d39d9-combined-ca-bundle\") pod \"heat-cfnapi-69676fb7c9-tmm27\" (UID: \"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.976774 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea336475-3963-43eb-9e16-814d0c717625-public-tls-certs\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.976830 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bzlc\" (UniqueName: \"kubernetes.io/projected/ea336475-3963-43eb-9e16-814d0c717625-kube-api-access-9bzlc\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.976859 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea336475-3963-43eb-9e16-814d0c717625-config-data\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.976886 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea336475-3963-43eb-9e16-814d0c717625-config-data-custom\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " 
pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.976927 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9a8d05d-1353-46db-9367-c7205a7d39d9-config-data-custom\") pod \"heat-cfnapi-69676fb7c9-tmm27\" (UID: \"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.993592 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea336475-3963-43eb-9e16-814d0c717625-combined-ca-bundle\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:33 crc kubenswrapper[4775]: I1216 15:16:33.995056 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a8d05d-1353-46db-9367-c7205a7d39d9-combined-ca-bundle\") pod \"heat-cfnapi-69676fb7c9-tmm27\" (UID: \"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.001364 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea336475-3963-43eb-9e16-814d0c717625-internal-tls-certs\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.004197 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea336475-3963-43eb-9e16-814d0c717625-public-tls-certs\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:34 crc kubenswrapper[4775]: 
I1216 15:16:34.004559 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9a8d05d-1353-46db-9367-c7205a7d39d9-public-tls-certs\") pod \"heat-cfnapi-69676fb7c9-tmm27\" (UID: \"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.005209 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a8d05d-1353-46db-9367-c7205a7d39d9-config-data\") pod \"heat-cfnapi-69676fb7c9-tmm27\" (UID: \"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.005249 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9a8d05d-1353-46db-9367-c7205a7d39d9-config-data-custom\") pod \"heat-cfnapi-69676fb7c9-tmm27\" (UID: \"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.005788 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5wfh\" (UniqueName: \"kubernetes.io/projected/d9a8d05d-1353-46db-9367-c7205a7d39d9-kube-api-access-l5wfh\") pod \"heat-cfnapi-69676fb7c9-tmm27\" (UID: \"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.006758 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea336475-3963-43eb-9e16-814d0c717625-config-data\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.012638 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea336475-3963-43eb-9e16-814d0c717625-config-data-custom\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.018410 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9a8d05d-1353-46db-9367-c7205a7d39d9-internal-tls-certs\") pod \"heat-cfnapi-69676fb7c9-tmm27\" (UID: \"d9a8d05d-1353-46db-9367-c7205a7d39d9\") " pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.018523 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bzlc\" (UniqueName: \"kubernetes.io/projected/ea336475-3963-43eb-9e16-814d0c717625-kube-api-access-9bzlc\") pod \"heat-api-55844f6789-qwjbq\" (UID: \"ea336475-3963-43eb-9e16-814d0c717625\") " pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.111214 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.136443 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.203649 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.436728 4775 generic.go:334] "Generic (PLEG): container finished" podID="27246978-4a11-4370-9635-71ef44e99b6c" containerID="1346e61d1ead4c8b5724ad39b2813ecbc5c8bc421af45fe7093e0975737aa786" exitCode=0 Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.436804 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-svtt9" event={"ID":"27246978-4a11-4370-9635-71ef44e99b6c","Type":"ContainerDied","Data":"1346e61d1ead4c8b5724ad39b2813ecbc5c8bc421af45fe7093e0975737aa786"} Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.440871 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9435-account-create-update-td6k7" event={"ID":"c18fb7c5-60d7-497d-8031-4d3c073104a6","Type":"ContainerStarted","Data":"7865649c4b20ae8bea202168daa6882481e37dbbb48e96646f8d6a5da562f156"} Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.451004 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8vlhw" event={"ID":"382550fb-c9fc-4100-a196-8ab11975d0ad","Type":"ContainerStarted","Data":"7f2820803b30eb5fcb93179ee23c8d3177640fe2f3c423f4324a83d5acb92dad"} Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.455376 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-52a3-account-create-update-zq6jv" event={"ID":"b5c39639-d2db-4abd-a474-d141a0d0af35","Type":"ContainerStarted","Data":"ef48cf579301854bfb12b47d907a4bd3e91bb4f1fa55a1f39bbfd5df66060956"} Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.460016 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-41a9-account-create-update-fnj2g" 
event={"ID":"a87cdc28-aa31-4446-a1a6-e0904f9daa62","Type":"ContainerStarted","Data":"5f79622324d27df05cd2f7e6497a4c0e36295ff0ad0ad7dac5ab6430bf8465bb"} Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.463562 4775 generic.go:334] "Generic (PLEG): container finished" podID="0a2a6a81-f246-4963-bfcc-40d974860cd4" containerID="e903035b3104b3f236a6ba889058883c4f7a80ef2bdebff8c661b0b0d43be75c" exitCode=0 Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.463624 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0a2a6a81-f246-4963-bfcc-40d974860cd4","Type":"ContainerDied","Data":"e903035b3104b3f236a6ba889058883c4f7a80ef2bdebff8c661b0b0d43be75c"} Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.465515 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-68dw8" event={"ID":"454beaa2-a30a-4b5f-bb64-95eafaa20360","Type":"ContainerStarted","Data":"af4ae4b5be928f27149a43322fded1feb8100d688b9881855a3e88b4f6398880"} Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.467317 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-59fcc7f56d-krpcl" event={"ID":"262d5cc2-3677-4f62-aa93-60ccab4cf899","Type":"ContainerStarted","Data":"670826beddb12fe847f33298de94d9ccea284e264d11f62f76e0d9cb0c6e6e1a"} Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.467476 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-59fcc7f56d-krpcl" Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.471699 4775 generic.go:334] "Generic (PLEG): container finished" podID="8a5c773c-3b58-405f-a31f-b4b872509e1e" containerID="0ca9e71065dce630b808e7bc5b992ab95e7313eea2afb89b56d8fd5f2b3bee47" exitCode=137 Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.471786 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"8a5c773c-3b58-405f-a31f-b4b872509e1e","Type":"ContainerDied","Data":"0ca9e71065dce630b808e7bc5b992ab95e7313eea2afb89b56d8fd5f2b3bee47"} Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.488529 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-59fcc7f56d-krpcl" podStartSLOduration=4.488510352 podStartE2EDuration="4.488510352s" podCreationTimestamp="2025-12-16 15:16:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:34.487225802 +0000 UTC m=+1319.438304735" watchObservedRunningTime="2025-12-16 15:16:34.488510352 +0000 UTC m=+1319.439589275" Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.989127 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="0a2a6a81-f246-4963-bfcc-40d974860cd4" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.147:9292/healthcheck\": dial tcp 10.217.0.147:9292: connect: connection refused" Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.989167 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 15:16:34 crc kubenswrapper[4775]: I1216 15:16:34.989169 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="0a2a6a81-f246-4963-bfcc-40d974860cd4" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.147:9292/healthcheck\": dial tcp 10.217.0.147:9292: connect: connection refused" Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.103025 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-combined-ca-bundle\") pod \"8a5c773c-3b58-405f-a31f-b4b872509e1e\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.103070 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-config-data\") pod \"8a5c773c-3b58-405f-a31f-b4b872509e1e\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.103108 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-scripts\") pod \"8a5c773c-3b58-405f-a31f-b4b872509e1e\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.103193 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxkpk\" (UniqueName: \"kubernetes.io/projected/8a5c773c-3b58-405f-a31f-b4b872509e1e-kube-api-access-cxkpk\") pod \"8a5c773c-3b58-405f-a31f-b4b872509e1e\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.103224 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a5c773c-3b58-405f-a31f-b4b872509e1e-etc-machine-id\") pod \"8a5c773c-3b58-405f-a31f-b4b872509e1e\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.103358 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-config-data-custom\") pod \"8a5c773c-3b58-405f-a31f-b4b872509e1e\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.103392 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a5c773c-3b58-405f-a31f-b4b872509e1e-logs\") pod \"8a5c773c-3b58-405f-a31f-b4b872509e1e\" (UID: \"8a5c773c-3b58-405f-a31f-b4b872509e1e\") " Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.104483 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a5c773c-3b58-405f-a31f-b4b872509e1e-logs" (OuterVolumeSpecName: "logs") pod "8a5c773c-3b58-405f-a31f-b4b872509e1e" (UID: "8a5c773c-3b58-405f-a31f-b4b872509e1e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.104526 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a5c773c-3b58-405f-a31f-b4b872509e1e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8a5c773c-3b58-405f-a31f-b4b872509e1e" (UID: "8a5c773c-3b58-405f-a31f-b4b872509e1e"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.109803 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8a5c773c-3b58-405f-a31f-b4b872509e1e" (UID: "8a5c773c-3b58-405f-a31f-b4b872509e1e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.114736 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-scripts" (OuterVolumeSpecName: "scripts") pod "8a5c773c-3b58-405f-a31f-b4b872509e1e" (UID: "8a5c773c-3b58-405f-a31f-b4b872509e1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.128285 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a5c773c-3b58-405f-a31f-b4b872509e1e-kube-api-access-cxkpk" (OuterVolumeSpecName: "kube-api-access-cxkpk") pod "8a5c773c-3b58-405f-a31f-b4b872509e1e" (UID: "8a5c773c-3b58-405f-a31f-b4b872509e1e"). InnerVolumeSpecName "kube-api-access-cxkpk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.139940 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a5c773c-3b58-405f-a31f-b4b872509e1e" (UID: "8a5c773c-3b58-405f-a31f-b4b872509e1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.186642 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-config-data" (OuterVolumeSpecName: "config-data") pod "8a5c773c-3b58-405f-a31f-b4b872509e1e" (UID: "8a5c773c-3b58-405f-a31f-b4b872509e1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.206261 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.206603 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a5c773c-3b58-405f-a31f-b4b872509e1e-logs\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.206614 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.206623 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.206633 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a5c773c-3b58-405f-a31f-b4b872509e1e-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.206644 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxkpk\" (UniqueName: \"kubernetes.io/projected/8a5c773c-3b58-405f-a31f-b4b872509e1e-kube-api-access-cxkpk\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.206659 4775 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a5c773c-3b58-405f-a31f-b4b872509e1e-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.373151 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b12d393-83d8-4db9-8aee-517105ff8484" path="/var/lib/kubelet/pods/4b12d393-83d8-4db9-8aee-517105ff8484/volumes"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.505360 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a5c773c-3b58-405f-a31f-b4b872509e1e","Type":"ContainerDied","Data":"10100b52842afe1614a745ae92d50c91a6f770ff8840991ae509167e987b18d1"}
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.505419 4775 scope.go:117] "RemoveContainer" containerID="0ca9e71065dce630b808e7bc5b992ab95e7313eea2afb89b56d8fd5f2b3bee47"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.505693 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.509958 4775 generic.go:334] "Generic (PLEG): container finished" podID="cae742d1-aa2b-4462-a2ca-e3b73d58d564" containerID="298cb301eb8afbe6b4e5205af068322e1c3698f9b518c92ec1229c8568754020" exitCode=0
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.510021 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cae742d1-aa2b-4462-a2ca-e3b73d58d564","Type":"ContainerDied","Data":"298cb301eb8afbe6b4e5205af068322e1c3698f9b518c92ec1229c8568754020"}
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.538129 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.621475 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.632184 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Dec 16 15:16:35 crc kubenswrapper[4775]: E1216 15:16:35.632688 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a5c773c-3b58-405f-a31f-b4b872509e1e" containerName="cinder-api-log"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.632710 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a5c773c-3b58-405f-a31f-b4b872509e1e" containerName="cinder-api-log"
Dec 16 15:16:35 crc kubenswrapper[4775]: E1216 15:16:35.632742 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a5c773c-3b58-405f-a31f-b4b872509e1e" containerName="cinder-api"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.632751 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a5c773c-3b58-405f-a31f-b4b872509e1e" containerName="cinder-api"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.632972 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a5c773c-3b58-405f-a31f-b4b872509e1e" containerName="cinder-api-log"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.632997 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a5c773c-3b58-405f-a31f-b4b872509e1e" containerName="cinder-api"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.634365 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.640150 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.640342 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.640435 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.679804 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.736472 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg9dv\" (UniqueName: \"kubernetes.io/projected/521e4d08-bfb5-4043-bc0f-7515dbeb467f-kube-api-access-cg9dv\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.736571 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.736598 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/521e4d08-bfb5-4043-bc0f-7515dbeb467f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.736630 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-scripts\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.736647 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-config-data-custom\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.736667 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/521e4d08-bfb5-4043-bc0f-7515dbeb467f-logs\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.736723 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-config-data\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.736746 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.737075 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.839584 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-config-data\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.839989 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.840022 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.840068 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg9dv\" (UniqueName: \"kubernetes.io/projected/521e4d08-bfb5-4043-bc0f-7515dbeb467f-kube-api-access-cg9dv\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.840131 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.840151 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/521e4d08-bfb5-4043-bc0f-7515dbeb467f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.840182 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-scripts\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.840207 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-config-data-custom\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.840232 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/521e4d08-bfb5-4043-bc0f-7515dbeb467f-logs\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.840826 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/521e4d08-bfb5-4043-bc0f-7515dbeb467f-logs\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.842165 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/521e4d08-bfb5-4043-bc0f-7515dbeb467f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.846286 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.851335 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-config-data\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.851644 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.851878 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.851988 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-config-data-custom\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.858823 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/521e4d08-bfb5-4043-bc0f-7515dbeb467f-scripts\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.863146 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg9dv\" (UniqueName: \"kubernetes.io/projected/521e4d08-bfb5-4043-bc0f-7515dbeb467f-kube-api-access-cg9dv\") pod \"cinder-api-0\" (UID: \"521e4d08-bfb5-4043-bc0f-7515dbeb467f\") " pod="openstack/cinder-api-0"
Dec 16 15:16:35 crc kubenswrapper[4775]: I1216 15:16:35.961815 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.340498 4775 scope.go:117] "RemoveContainer" containerID="4634da530ff6bbd47268c50143df7998ac4625ef3df4f95e643d20e3724931a1"
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.606322 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.612487 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-svtt9"
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.613420 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0a2a6a81-f246-4963-bfcc-40d974860cd4","Type":"ContainerDied","Data":"ae2789a7181190ded83220910a176d533a2d2774be98ebfcbcf44a1cfdd326d4"}
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.613473 4775 scope.go:117] "RemoveContainer" containerID="e903035b3104b3f236a6ba889058883c4f7a80ef2bdebff8c661b0b0d43be75c"
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.621395 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-svtt9" event={"ID":"27246978-4a11-4370-9635-71ef44e99b6c","Type":"ContainerDied","Data":"26eaad3fb57436593f69c443bd5820ca0b5a45c68a8b4f4d794e7019205fc99a"}
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.621435 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26eaad3fb57436593f69c443bd5820ca0b5a45c68a8b4f4d794e7019205fc99a"
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.621512 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-svtt9"
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.724003 4775 scope.go:117] "RemoveContainer" containerID="fe97ccdd2b524fba889177fa7b4be60d1aff4252728e422879dd3ed648a7e03f"
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.762557 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27246978-4a11-4370-9635-71ef44e99b6c-operator-scripts\") pod \"27246978-4a11-4370-9635-71ef44e99b6c\" (UID: \"27246978-4a11-4370-9635-71ef44e99b6c\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.762841 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tzgv\" (UniqueName: \"kubernetes.io/projected/0a2a6a81-f246-4963-bfcc-40d974860cd4-kube-api-access-5tzgv\") pod \"0a2a6a81-f246-4963-bfcc-40d974860cd4\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.762878 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-combined-ca-bundle\") pod \"0a2a6a81-f246-4963-bfcc-40d974860cd4\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.762971 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a2a6a81-f246-4963-bfcc-40d974860cd4-httpd-run\") pod \"0a2a6a81-f246-4963-bfcc-40d974860cd4\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.763042 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a2a6a81-f246-4963-bfcc-40d974860cd4-logs\") pod \"0a2a6a81-f246-4963-bfcc-40d974860cd4\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.763094 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-public-tls-certs\") pod \"0a2a6a81-f246-4963-bfcc-40d974860cd4\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.763127 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"0a2a6a81-f246-4963-bfcc-40d974860cd4\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.763165 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-scripts\") pod \"0a2a6a81-f246-4963-bfcc-40d974860cd4\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.763187 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-config-data\") pod \"0a2a6a81-f246-4963-bfcc-40d974860cd4\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.763244 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2gmj\" (UniqueName: \"kubernetes.io/projected/27246978-4a11-4370-9635-71ef44e99b6c-kube-api-access-v2gmj\") pod \"27246978-4a11-4370-9635-71ef44e99b6c\" (UID: \"27246978-4a11-4370-9635-71ef44e99b6c\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.764185 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27246978-4a11-4370-9635-71ef44e99b6c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "27246978-4a11-4370-9635-71ef44e99b6c" (UID: "27246978-4a11-4370-9635-71ef44e99b6c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.776421 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a2a6a81-f246-4963-bfcc-40d974860cd4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0a2a6a81-f246-4963-bfcc-40d974860cd4" (UID: "0a2a6a81-f246-4963-bfcc-40d974860cd4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.776926 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a2a6a81-f246-4963-bfcc-40d974860cd4-logs" (OuterVolumeSpecName: "logs") pod "0a2a6a81-f246-4963-bfcc-40d974860cd4" (UID: "0a2a6a81-f246-4963-bfcc-40d974860cd4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.782473 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.804101 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a2a6a81-f246-4963-bfcc-40d974860cd4-kube-api-access-5tzgv" (OuterVolumeSpecName: "kube-api-access-5tzgv") pod "0a2a6a81-f246-4963-bfcc-40d974860cd4" (UID: "0a2a6a81-f246-4963-bfcc-40d974860cd4"). InnerVolumeSpecName "kube-api-access-5tzgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.804215 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "0a2a6a81-f246-4963-bfcc-40d974860cd4" (UID: "0a2a6a81-f246-4963-bfcc-40d974860cd4"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.804369 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27246978-4a11-4370-9635-71ef44e99b6c-kube-api-access-v2gmj" (OuterVolumeSpecName: "kube-api-access-v2gmj") pod "27246978-4a11-4370-9635-71ef44e99b6c" (UID: "27246978-4a11-4370-9635-71ef44e99b6c"). InnerVolumeSpecName "kube-api-access-v2gmj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.808472 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-scripts" (OuterVolumeSpecName: "scripts") pod "0a2a6a81-f246-4963-bfcc-40d974860cd4" (UID: "0a2a6a81-f246-4963-bfcc-40d974860cd4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.846328 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.875717 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.875777 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.875791 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2gmj\" (UniqueName: \"kubernetes.io/projected/27246978-4a11-4370-9635-71ef44e99b6c-kube-api-access-v2gmj\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.875802 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27246978-4a11-4370-9635-71ef44e99b6c-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.875815 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tzgv\" (UniqueName: \"kubernetes.io/projected/0a2a6a81-f246-4963-bfcc-40d974860cd4-kube-api-access-5tzgv\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.875826 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a2a6a81-f246-4963-bfcc-40d974860cd4-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.875835 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a2a6a81-f246-4963-bfcc-40d974860cd4-logs\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.981310 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.981374 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-config-data\") pod \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.981457 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-combined-ca-bundle\") pod \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.981478 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-scripts\") pod \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.981505 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpzmr\" (UniqueName: \"kubernetes.io/projected/cae742d1-aa2b-4462-a2ca-e3b73d58d564-kube-api-access-fpzmr\") pod \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.981620 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cae742d1-aa2b-4462-a2ca-e3b73d58d564-httpd-run\") pod \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.981668 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-internal-tls-certs\") pod \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.982990 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cae742d1-aa2b-4462-a2ca-e3b73d58d564-logs\") pod \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") "
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.985744 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cae742d1-aa2b-4462-a2ca-e3b73d58d564-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cae742d1-aa2b-4462-a2ca-e3b73d58d564" (UID: "cae742d1-aa2b-4462-a2ca-e3b73d58d564"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 15:16:36 crc kubenswrapper[4775]: I1216 15:16:36.987784 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cae742d1-aa2b-4462-a2ca-e3b73d58d564-logs" (OuterVolumeSpecName: "logs") pod "cae742d1-aa2b-4462-a2ca-e3b73d58d564" (UID: "cae742d1-aa2b-4462-a2ca-e3b73d58d564"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.016965 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "cae742d1-aa2b-4462-a2ca-e3b73d58d564" (UID: "cae742d1-aa2b-4462-a2ca-e3b73d58d564"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.017696 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-scripts" (OuterVolumeSpecName: "scripts") pod "cae742d1-aa2b-4462-a2ca-e3b73d58d564" (UID: "cae742d1-aa2b-4462-a2ca-e3b73d58d564"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.021603 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae742d1-aa2b-4462-a2ca-e3b73d58d564-kube-api-access-fpzmr" (OuterVolumeSpecName: "kube-api-access-fpzmr") pod "cae742d1-aa2b-4462-a2ca-e3b73d58d564" (UID: "cae742d1-aa2b-4462-a2ca-e3b73d58d564"). InnerVolumeSpecName "kube-api-access-fpzmr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.043260 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-55844f6789-qwjbq"]
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.088335 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cae742d1-aa2b-4462-a2ca-e3b73d58d564-logs\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.088386 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.088400 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.088416 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpzmr\" (UniqueName: \"kubernetes.io/projected/cae742d1-aa2b-4462-a2ca-e3b73d58d564-kube-api-access-fpzmr\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.088429 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cae742d1-aa2b-4462-a2ca-e3b73d58d564-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.136871 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.189823 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.232041 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-69676fb7c9-tmm27"]
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.271307 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.376082 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a5c773c-3b58-405f-a31f-b4b872509e1e" path="/var/lib/kubelet/pods/8a5c773c-3b58-405f-a31f-b4b872509e1e/volumes"
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.414278 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cae742d1-aa2b-4462-a2ca-e3b73d58d564" (UID: "cae742d1-aa2b-4462-a2ca-e3b73d58d564"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.414820 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-combined-ca-bundle\") pod \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\" (UID: \"cae742d1-aa2b-4462-a2ca-e3b73d58d564\") "
Dec 16 15:16:37 crc kubenswrapper[4775]: W1216 15:16:37.414974 4775 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/cae742d1-aa2b-4462-a2ca-e3b73d58d564/volumes/kubernetes.io~secret/combined-ca-bundle
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.414992 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cae742d1-aa2b-4462-a2ca-e3b73d58d564" (UID: "cae742d1-aa2b-4462-a2ca-e3b73d58d564"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.415738 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.614001 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.622056 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.676252 4775 generic.go:334] "Generic (PLEG): container finished" podID="454beaa2-a30a-4b5f-bb64-95eafaa20360" containerID="50def315ca15d81b803f1d97fa51972c7bc6a3f2a28d7e9266064f84971336e6" exitCode=0
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.724501 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cae742d1-aa2b-4462-a2ca-e3b73d58d564" (UID: "cae742d1-aa2b-4462-a2ca-e3b73d58d564"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.724919 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a2a6a81-f246-4963-bfcc-40d974860cd4" (UID: "0a2a6a81-f246-4963-bfcc-40d974860cd4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.725083 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-combined-ca-bundle\") pod \"0a2a6a81-f246-4963-bfcc-40d974860cd4\" (UID: \"0a2a6a81-f246-4963-bfcc-40d974860cd4\") " Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.726074 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:37 crc kubenswrapper[4775]: W1216 15:16:37.726713 4775 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/0a2a6a81-f246-4963-bfcc-40d974860cd4/volumes/kubernetes.io~secret/combined-ca-bundle Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.726732 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a2a6a81-f246-4963-bfcc-40d974860cd4" (UID: "0a2a6a81-f246-4963-bfcc-40d974860cd4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.757019 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" podStartSLOduration=4.121843864 podStartE2EDuration="7.756987801s" podCreationTimestamp="2025-12-16 15:16:30 +0000 UTC" firstStartedPulling="2025-12-16 15:16:32.797678777 +0000 UTC m=+1317.748757700" lastFinishedPulling="2025-12-16 15:16:36.432822714 +0000 UTC m=+1321.383901637" observedRunningTime="2025-12-16 15:16:37.748375439 +0000 UTC m=+1322.699454362" watchObservedRunningTime="2025-12-16 15:16:37.756987801 +0000 UTC m=+1322.708066764" Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.789036 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0a2a6a81-f246-4963-bfcc-40d974860cd4" (UID: "0a2a6a81-f246-4963-bfcc-40d974860cd4"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.829977 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.830009 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.834797 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-8vlhw" podStartSLOduration=5.834772025 podStartE2EDuration="5.834772025s" podCreationTimestamp="2025-12-16 15:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:37.819245665 +0000 UTC m=+1322.770324588" watchObservedRunningTime="2025-12-16 15:16:37.834772025 +0000 UTC m=+1322.785850948" Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.870242 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.873320 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-41a9-account-create-update-fnj2g" podStartSLOduration=5.87329542 podStartE2EDuration="5.87329542s" podCreationTimestamp="2025-12-16 15:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:37.860177517 +0000 UTC m=+1322.811256450" watchObservedRunningTime="2025-12-16 15:16:37.87329542 +0000 UTC m=+1322.824374343" Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.902268 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-config-data" (OuterVolumeSpecName: "config-data") pod "cae742d1-aa2b-4462-a2ca-e3b73d58d564" (UID: "cae742d1-aa2b-4462-a2ca-e3b73d58d564"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.909906 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.928434 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-config-data" (OuterVolumeSpecName: "config-data") pod "0a2a6a81-f246-4963-bfcc-40d974860cd4" (UID: "0a2a6a81-f246-4963-bfcc-40d974860cd4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.933218 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a2a6a81-f246-4963-bfcc-40d974860cd4-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:37 crc kubenswrapper[4775]: I1216 15:16:37.933255 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae742d1-aa2b-4462-a2ca-e3b73d58d564-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.062933 4775 generic.go:334] "Generic (PLEG): container finished" podID="b5c39639-d2db-4abd-a474-d141a0d0af35" containerID="e78dc26c846c00548fc65746fd5409d68567cc52037fcba64378d9ad4be2b989" exitCode=0 Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.070969 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-9435-account-create-update-td6k7" podStartSLOduration=6.070943296 podStartE2EDuration="6.070943296s" podCreationTimestamp="2025-12-16 15:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:38.067199368 +0000 UTC m=+1323.018278321" watchObservedRunningTime="2025-12-16 15:16:38.070943296 +0000 UTC m=+1323.022022229" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.271283 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.271324 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-68dw8" event={"ID":"454beaa2-a30a-4b5f-bb64-95eafaa20360","Type":"ContainerDied","Data":"50def315ca15d81b803f1d97fa51972c7bc6a3f2a28d7e9266064f84971336e6"} Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.271352 4775 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" event={"ID":"402ffd59-e84f-4e09-9d8d-d89c6c788547","Type":"ContainerStarted","Data":"3a16dd91bf66af58e44a5f8b807cb8ff89cbd54fe8effe39547eb3354e03a7f6"} Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.271372 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8vlhw" event={"ID":"382550fb-c9fc-4100-a196-8ab11975d0ad","Type":"ContainerStarted","Data":"56addecc1e52f86ca9007267a1f53cb04b8bc81de1ab3438dd6b3916e668a6df"} Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.271383 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-41a9-account-create-update-fnj2g" event={"ID":"a87cdc28-aa31-4446-a1a6-e0904f9daa62","Type":"ContainerStarted","Data":"d803eff0ab69221595095bfe69b917868b310e315085c0ba307c7dc6f7ad4058"} Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.271393 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cae742d1-aa2b-4462-a2ca-e3b73d58d564","Type":"ContainerDied","Data":"728fa0f2c920d936875865881d767b8119712f216c11434ac8170a432742f78c"} Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.271410 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-69676fb7c9-tmm27" event={"ID":"d9a8d05d-1353-46db-9367-c7205a7d39d9","Type":"ContainerStarted","Data":"4fb1656307aeb496e2b12ed176161404b9283a286eb387dbb6a5f89ab1d36e4b"} Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.271420 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"521e4d08-bfb5-4043-bc0f-7515dbeb467f","Type":"ContainerStarted","Data":"a23c52b473c46bc6a31bea40571e5510e3fdc205aba6529382bf95d0406b7e96"} Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.271432 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f3399b2a-e8f1-442d-be74-160e5524608b","Type":"ContainerStarted","Data":"2c4e454f47a2dd2a0188abb4d911f6e5b2a23a7f5320851325bdf771093adf80"} Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.271443 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9435-account-create-update-td6k7" event={"ID":"c18fb7c5-60d7-497d-8031-4d3c073104a6","Type":"ContainerStarted","Data":"82de921d2b8a698c8cdd1fc8123013c7b791c624d3bc666653375f4a26a7a447"} Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.271453 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-55844f6789-qwjbq" event={"ID":"ea336475-3963-43eb-9e16-814d0c717625","Type":"ContainerStarted","Data":"b4a3ada540efc998a746a431266951af7491839e00b16f2632272fafe08dbe9d"} Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.271464 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-52a3-account-create-update-zq6jv" event={"ID":"b5c39639-d2db-4abd-a474-d141a0d0af35","Type":"ContainerDied","Data":"e78dc26c846c00548fc65746fd5409d68567cc52037fcba64378d9ad4be2b989"} Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.272121 4775 scope.go:117] "RemoveContainer" containerID="298cb301eb8afbe6b4e5205af068322e1c3698f9b518c92ec1229c8568754020" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.366721 4775 scope.go:117] "RemoveContainer" containerID="2ba717144947e6116cbb8342108d99471e1beb3f36cdefc8beba61b3c7ecf227" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.378011 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.410973 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.432217 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:16:38 crc 
kubenswrapper[4775]: I1216 15:16:38.443558 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.460156 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:16:38 crc kubenswrapper[4775]: E1216 15:16:38.461040 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae742d1-aa2b-4462-a2ca-e3b73d58d564" containerName="glance-httpd" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.461062 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae742d1-aa2b-4462-a2ca-e3b73d58d564" containerName="glance-httpd" Dec 16 15:16:38 crc kubenswrapper[4775]: E1216 15:16:38.461088 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2a6a81-f246-4963-bfcc-40d974860cd4" containerName="glance-httpd" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.461122 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2a6a81-f246-4963-bfcc-40d974860cd4" containerName="glance-httpd" Dec 16 15:16:38 crc kubenswrapper[4775]: E1216 15:16:38.461147 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27246978-4a11-4370-9635-71ef44e99b6c" containerName="mariadb-database-create" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.461158 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="27246978-4a11-4370-9635-71ef44e99b6c" containerName="mariadb-database-create" Dec 16 15:16:38 crc kubenswrapper[4775]: E1216 15:16:38.461169 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae742d1-aa2b-4462-a2ca-e3b73d58d564" containerName="glance-log" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.461177 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae742d1-aa2b-4462-a2ca-e3b73d58d564" containerName="glance-log" Dec 16 15:16:38 crc kubenswrapper[4775]: E1216 15:16:38.461229 4775 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="0a2a6a81-f246-4963-bfcc-40d974860cd4" containerName="glance-log" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.461239 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2a6a81-f246-4963-bfcc-40d974860cd4" containerName="glance-log" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.461619 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a2a6a81-f246-4963-bfcc-40d974860cd4" containerName="glance-httpd" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.461646 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="27246978-4a11-4370-9635-71ef44e99b6c" containerName="mariadb-database-create" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.461659 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a2a6a81-f246-4963-bfcc-40d974860cd4" containerName="glance-log" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.461709 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae742d1-aa2b-4462-a2ca-e3b73d58d564" containerName="glance-log" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.461723 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae742d1-aa2b-4462-a2ca-e3b73d58d564" containerName="glance-httpd" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.468610 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.476113 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lzhbs" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.476481 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.476696 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.476961 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.482631 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.484768 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.489438 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.490376 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.516354 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.563570 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.652834 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e61daa0-8bea-4632-8936-5fb68d555ab1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.652878 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7732fee4-0518-41db-be31-b9c7ae4aca6b-scripts\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.652912 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7732fee4-0518-41db-be31-b9c7ae4aca6b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: 
I1216 15:16:38.652955 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e61daa0-8bea-4632-8936-5fb68d555ab1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.652982 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7732fee4-0518-41db-be31-b9c7ae4aca6b-logs\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.653007 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e61daa0-8bea-4632-8936-5fb68d555ab1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.653056 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e61daa0-8bea-4632-8936-5fb68d555ab1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.653081 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 
15:16:38.653099 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e61daa0-8bea-4632-8936-5fb68d555ab1-logs\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.653119 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e61daa0-8bea-4632-8936-5fb68d555ab1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.653138 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7732fee4-0518-41db-be31-b9c7ae4aca6b-config-data\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.653169 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.653193 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7732fee4-0518-41db-be31-b9c7ae4aca6b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 
15:16:38.653210 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfbtg\" (UniqueName: \"kubernetes.io/projected/0e61daa0-8bea-4632-8936-5fb68d555ab1-kube-api-access-rfbtg\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.653236 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdcnb\" (UniqueName: \"kubernetes.io/projected/7732fee4-0518-41db-be31-b9c7ae4aca6b-kube-api-access-fdcnb\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.653264 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7732fee4-0518-41db-be31-b9c7ae4aca6b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.754566 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.754614 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7732fee4-0518-41db-be31-b9c7ae4aca6b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc 
kubenswrapper[4775]: I1216 15:16:38.754639 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfbtg\" (UniqueName: \"kubernetes.io/projected/0e61daa0-8bea-4632-8936-5fb68d555ab1-kube-api-access-rfbtg\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.754668 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdcnb\" (UniqueName: \"kubernetes.io/projected/7732fee4-0518-41db-be31-b9c7ae4aca6b-kube-api-access-fdcnb\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.754699 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7732fee4-0518-41db-be31-b9c7ae4aca6b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.754726 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e61daa0-8bea-4632-8936-5fb68d555ab1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.754742 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7732fee4-0518-41db-be31-b9c7ae4aca6b-scripts\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.754756 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7732fee4-0518-41db-be31-b9c7ae4aca6b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.754791 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e61daa0-8bea-4632-8936-5fb68d555ab1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.754815 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7732fee4-0518-41db-be31-b9c7ae4aca6b-logs\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.754846 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e61daa0-8bea-4632-8936-5fb68d555ab1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.754916 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e61daa0-8bea-4632-8936-5fb68d555ab1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.754936 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.754952 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e61daa0-8bea-4632-8936-5fb68d555ab1-logs\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.754971 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e61daa0-8bea-4632-8936-5fb68d555ab1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.754990 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7732fee4-0518-41db-be31-b9c7ae4aca6b-config-data\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.755761 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.757714 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/0e61daa0-8bea-4632-8936-5fb68d555ab1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.758199 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e61daa0-8bea-4632-8936-5fb68d555ab1-logs\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.759417 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.759686 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7732fee4-0518-41db-be31-b9c7ae4aca6b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.760009 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7732fee4-0518-41db-be31-b9c7ae4aca6b-logs\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.760568 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e61daa0-8bea-4632-8936-5fb68d555ab1-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.765923 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7732fee4-0518-41db-be31-b9c7ae4aca6b-config-data\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.783196 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7732fee4-0518-41db-be31-b9c7ae4aca6b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.783298 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e61daa0-8bea-4632-8936-5fb68d555ab1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.783513 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7732fee4-0518-41db-be31-b9c7ae4aca6b-scripts\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.785393 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e61daa0-8bea-4632-8936-5fb68d555ab1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 
15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.786446 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7732fee4-0518-41db-be31-b9c7ae4aca6b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.786867 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e61daa0-8bea-4632-8936-5fb68d555ab1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.840126 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdcnb\" (UniqueName: \"kubernetes.io/projected/7732fee4-0518-41db-be31-b9c7ae4aca6b-kube-api-access-fdcnb\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.860706 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfbtg\" (UniqueName: \"kubernetes.io/projected/0e61daa0-8bea-4632-8936-5fb68d555ab1-kube-api-access-rfbtg\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.928382 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"0e61daa0-8bea-4632-8936-5fb68d555ab1\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:38 crc kubenswrapper[4775]: I1216 15:16:38.951664 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7732fee4-0518-41db-be31-b9c7ae4aca6b\") " pod="openstack/glance-default-external-api-0" Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.083790 4775 generic.go:334] "Generic (PLEG): container finished" podID="a87cdc28-aa31-4446-a1a6-e0904f9daa62" containerID="d803eff0ab69221595095bfe69b917868b310e315085c0ba307c7dc6f7ad4058" exitCode=0 Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.083850 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-41a9-account-create-update-fnj2g" event={"ID":"a87cdc28-aa31-4446-a1a6-e0904f9daa62","Type":"ContainerDied","Data":"d803eff0ab69221595095bfe69b917868b310e315085c0ba307c7dc6f7ad4058"} Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.091169 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-69676fb7c9-tmm27" event={"ID":"d9a8d05d-1353-46db-9367-c7205a7d39d9","Type":"ContainerStarted","Data":"1edbb10ecf3502a98a46b50ba20504681e37b6fba5799d5e2b1cd0c79ea01014"} Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.091256 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.093285 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-65646f4f55-j22ds" event={"ID":"76beac71-bf66-45ec-8a1f-5f6ed8122888","Type":"ContainerStarted","Data":"7b8212cb594a5b8536189bbaf23f255279bcdcd2f1ef535fd2dc115dc6a2dccf"} Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.093434 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-65646f4f55-j22ds" podUID="76beac71-bf66-45ec-8a1f-5f6ed8122888" containerName="heat-cfnapi" containerID="cri-o://7b8212cb594a5b8536189bbaf23f255279bcdcd2f1ef535fd2dc115dc6a2dccf" 
gracePeriod=60 Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.093741 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-65646f4f55-j22ds" Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.114374 4775 generic.go:334] "Generic (PLEG): container finished" podID="c18fb7c5-60d7-497d-8031-4d3c073104a6" containerID="82de921d2b8a698c8cdd1fc8123013c7b791c624d3bc666653375f4a26a7a447" exitCode=0 Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.114601 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9435-account-create-update-td6k7" event={"ID":"c18fb7c5-60d7-497d-8031-4d3c073104a6","Type":"ContainerDied","Data":"82de921d2b8a698c8cdd1fc8123013c7b791c624d3bc666653375f4a26a7a447"} Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.125258 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-55844f6789-qwjbq" event={"ID":"ea336475-3963-43eb-9e16-814d0c717625","Type":"ContainerStarted","Data":"ea11413bd2de5ccac4d6350d0fda10401e8a024aa0b8969268ead4d15112b947"} Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.126073 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.135429 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.141235 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-69676fb7c9-tmm27" podStartSLOduration=6.141213262 podStartE2EDuration="6.141213262s" podCreationTimestamp="2025-12-16 15:16:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:39.1259162 +0000 UTC m=+1324.076995123" watchObservedRunningTime="2025-12-16 15:16:39.141213262 +0000 UTC m=+1324.092292185" Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.146986 4775 generic.go:334] "Generic (PLEG): container finished" podID="2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d" containerID="64ee8494cd8b2f7293109016ab3329239eba909a6f402b28d52f228ac6865b5a" exitCode=1 Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.147127 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-76487b4cc4-bpbgr" event={"ID":"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d","Type":"ContainerDied","Data":"64ee8494cd8b2f7293109016ab3329239eba909a6f402b28d52f228ac6865b5a"} Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.147765 4775 scope.go:117] "RemoveContainer" containerID="64ee8494cd8b2f7293109016ab3329239eba909a6f402b28d52f228ac6865b5a" Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.162394 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-78c456ddf7-dbpj2" event={"ID":"e42b8d55-b8c7-4982-9cef-714706199d4a","Type":"ContainerStarted","Data":"6b76060f99541f01e9151fd0b91e13f69bb8e5c7d73c1928f85fd943513c9c2c"} Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.162641 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-78c456ddf7-dbpj2" podUID="e42b8d55-b8c7-4982-9cef-714706199d4a" containerName="heat-api" 
containerID="cri-o://6b76060f99541f01e9151fd0b91e13f69bb8e5c7d73c1928f85fd943513c9c2c" gracePeriod=60 Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.162870 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-78c456ddf7-dbpj2" Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.163436 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.175468 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-65646f4f55-j22ds" podStartSLOduration=11.249551864 podStartE2EDuration="16.175452173s" podCreationTimestamp="2025-12-16 15:16:23 +0000 UTC" firstStartedPulling="2025-12-16 15:16:31.553462133 +0000 UTC m=+1316.504541056" lastFinishedPulling="2025-12-16 15:16:36.479362442 +0000 UTC m=+1321.430441365" observedRunningTime="2025-12-16 15:16:39.146395106 +0000 UTC m=+1324.097474029" watchObservedRunningTime="2025-12-16 15:16:39.175452173 +0000 UTC m=+1324.126531096" Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.182613 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"521e4d08-bfb5-4043-bc0f-7515dbeb467f","Type":"ContainerStarted","Data":"29ccabc78b493d4cb7ef9e781e1ea7c224c18d6e99314d873412d931aa77572b"} Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.193090 4775 generic.go:334] "Generic (PLEG): container finished" podID="402ffd59-e84f-4e09-9d8d-d89c6c788547" containerID="3a16dd91bf66af58e44a5f8b807cb8ff89cbd54fe8effe39547eb3354e03a7f6" exitCode=1 Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.193179 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" event={"ID":"402ffd59-e84f-4e09-9d8d-d89c6c788547","Type":"ContainerDied","Data":"3a16dd91bf66af58e44a5f8b807cb8ff89cbd54fe8effe39547eb3354e03a7f6"} Dec 16 15:16:39 crc kubenswrapper[4775]: 
I1216 15:16:39.193915 4775 scope.go:117] "RemoveContainer" containerID="3a16dd91bf66af58e44a5f8b807cb8ff89cbd54fe8effe39547eb3354e03a7f6" Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.212731 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-55844f6789-qwjbq" podStartSLOduration=6.212705498 podStartE2EDuration="6.212705498s" podCreationTimestamp="2025-12-16 15:16:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:39.182908298 +0000 UTC m=+1324.133987221" watchObservedRunningTime="2025-12-16 15:16:39.212705498 +0000 UTC m=+1324.163784411" Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.224232 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-78c456ddf7-dbpj2" podStartSLOduration=11.326750728 podStartE2EDuration="16.224208421s" podCreationTimestamp="2025-12-16 15:16:23 +0000 UTC" firstStartedPulling="2025-12-16 15:16:31.557129688 +0000 UTC m=+1316.508208611" lastFinishedPulling="2025-12-16 15:16:36.454587381 +0000 UTC m=+1321.405666304" observedRunningTime="2025-12-16 15:16:39.206701919 +0000 UTC m=+1324.157780842" watchObservedRunningTime="2025-12-16 15:16:39.224208421 +0000 UTC m=+1324.175287344" Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.253921 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3399b2a-e8f1-442d-be74-160e5524608b","Type":"ContainerStarted","Data":"edd8a8fdc8eb347f32ec09584f5f673b84dc2a4af5db33fd67bbb0286de9deb4"} Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.266196 4775 generic.go:334] "Generic (PLEG): container finished" podID="382550fb-c9fc-4100-a196-8ab11975d0ad" containerID="56addecc1e52f86ca9007267a1f53cb04b8bc81de1ab3438dd6b3916e668a6df" exitCode=0 Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.266402 4775 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell1-db-create-8vlhw" event={"ID":"382550fb-c9fc-4100-a196-8ab11975d0ad","Type":"ContainerDied","Data":"56addecc1e52f86ca9007267a1f53cb04b8bc81de1ab3438dd6b3916e668a6df"} Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.268017 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.382415 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a2a6a81-f246-4963-bfcc-40d974860cd4" path="/var/lib/kubelet/pods/0a2a6a81-f246-4963-bfcc-40d974860cd4/volumes" Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.383834 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae742d1-aa2b-4462-a2ca-e3b73d58d564" path="/var/lib/kubelet/pods/cae742d1-aa2b-4462-a2ca-e3b73d58d564/volumes" Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.384594 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-2xd25"] Dec 16 15:16:39 crc kubenswrapper[4775]: I1216 15:16:39.384998 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" podUID="ae6daf4a-5550-44e9-a0bd-11bc6527ad5d" containerName="dnsmasq-dns" containerID="cri-o://93d9517139b3988374248ab712d91f2f749fed150a9bd8239d3c3e9608d0978a" gracePeriod=10 Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.294544 4775 generic.go:334] "Generic (PLEG): container finished" podID="76beac71-bf66-45ec-8a1f-5f6ed8122888" containerID="7b8212cb594a5b8536189bbaf23f255279bcdcd2f1ef535fd2dc115dc6a2dccf" exitCode=0 Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.295290 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-65646f4f55-j22ds" event={"ID":"76beac71-bf66-45ec-8a1f-5f6ed8122888","Type":"ContainerDied","Data":"7b8212cb594a5b8536189bbaf23f255279bcdcd2f1ef535fd2dc115dc6a2dccf"} Dec 16 15:16:40 crc 
kubenswrapper[4775]: I1216 15:16:40.295551 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-65646f4f55-j22ds" event={"ID":"76beac71-bf66-45ec-8a1f-5f6ed8122888","Type":"ContainerDied","Data":"fcf168b3bd8082fdb2c3280cddc8f76ef170ba21bfb6fbece69fe031662c4f40"} Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.295568 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcf168b3bd8082fdb2c3280cddc8f76ef170ba21bfb6fbece69fe031662c4f40" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.303218 4775 generic.go:334] "Generic (PLEG): container finished" podID="402ffd59-e84f-4e09-9d8d-d89c6c788547" containerID="5ab773b0a14bc1a03f2a838c08d2cdd2505d2f815f150d57b5edea996c70cf7a" exitCode=1 Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.303296 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" event={"ID":"402ffd59-e84f-4e09-9d8d-d89c6c788547","Type":"ContainerDied","Data":"5ab773b0a14bc1a03f2a838c08d2cdd2505d2f815f150d57b5edea996c70cf7a"} Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.303345 4775 scope.go:117] "RemoveContainer" containerID="3a16dd91bf66af58e44a5f8b807cb8ff89cbd54fe8effe39547eb3354e03a7f6" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.304114 4775 scope.go:117] "RemoveContainer" containerID="5ab773b0a14bc1a03f2a838c08d2cdd2505d2f815f150d57b5edea996c70cf7a" Dec 16 15:16:40 crc kubenswrapper[4775]: E1216 15:16:40.304477 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-59c7c5dfbf-8495r_openstack(402ffd59-e84f-4e09-9d8d-d89c6c788547)\"" pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" podUID="402ffd59-e84f-4e09-9d8d-d89c6c788547" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.323685 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-52a3-account-create-update-zq6jv" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.325846 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3399b2a-e8f1-442d-be74-160e5524608b","Type":"ContainerStarted","Data":"93ac49b7de24427eb8d6e5965ec6d3292b7d7de1711710d6380d12dd02573032"} Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.336868 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.339194 4775 generic.go:334] "Generic (PLEG): container finished" podID="ae6daf4a-5550-44e9-a0bd-11bc6527ad5d" containerID="93d9517139b3988374248ab712d91f2f749fed150a9bd8239d3c3e9608d0978a" exitCode=0 Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.339261 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" event={"ID":"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d","Type":"ContainerDied","Data":"93d9517139b3988374248ab712d91f2f749fed150a9bd8239d3c3e9608d0978a"} Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.339286 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" event={"ID":"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d","Type":"ContainerDied","Data":"96a7479c014ec9840606b8bf2adf34b2116c96c95f7e85bce69f5a4d7639f887"} Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.343071 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-52a3-account-create-update-zq6jv" event={"ID":"b5c39639-d2db-4abd-a474-d141a0d0af35","Type":"ContainerDied","Data":"ef48cf579301854bfb12b47d907a4bd3e91bb4f1fa55a1f39bbfd5df66060956"} Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.343123 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef48cf579301854bfb12b47d907a4bd3e91bb4f1fa55a1f39bbfd5df66060956" 
Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.343199 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-52a3-account-create-update-zq6jv" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.346812 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-68dw8" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.368317 4775 generic.go:334] "Generic (PLEG): container finished" podID="2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d" containerID="c00d1eca1370ec628d48f9aa288a5ce69131eb1bd061b18df84d2c6371b539b1" exitCode=1 Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.368585 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-76487b4cc4-bpbgr" event={"ID":"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d","Type":"ContainerDied","Data":"c00d1eca1370ec628d48f9aa288a5ce69131eb1bd061b18df84d2c6371b539b1"} Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.369244 4775 scope.go:117] "RemoveContainer" containerID="c00d1eca1370ec628d48f9aa288a5ce69131eb1bd061b18df84d2c6371b539b1" Dec 16 15:16:40 crc kubenswrapper[4775]: E1216 15:16:40.369477 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-76487b4cc4-bpbgr_openstack(2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d)\"" pod="openstack/heat-api-76487b4cc4-bpbgr" podUID="2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.372228 4775 generic.go:334] "Generic (PLEG): container finished" podID="e42b8d55-b8c7-4982-9cef-714706199d4a" containerID="6b76060f99541f01e9151fd0b91e13f69bb8e5c7d73c1928f85fd943513c9c2c" exitCode=0 Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.372280 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-78c456ddf7-dbpj2" 
event={"ID":"e42b8d55-b8c7-4982-9cef-714706199d4a","Type":"ContainerDied","Data":"6b76060f99541f01e9151fd0b91e13f69bb8e5c7d73c1928f85fd943513c9c2c"} Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.373775 4775 scope.go:117] "RemoveContainer" containerID="93d9517139b3988374248ab712d91f2f749fed150a9bd8239d3c3e9608d0978a" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.373991 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-65646f4f55-j22ds" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.375689 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-68dw8" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.376123 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-68dw8" event={"ID":"454beaa2-a30a-4b5f-bb64-95eafaa20360","Type":"ContainerDied","Data":"af4ae4b5be928f27149a43322fded1feb8100d688b9881855a3e88b4f6398880"} Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.376147 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af4ae4b5be928f27149a43322fded1feb8100d688b9881855a3e88b4f6398880" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.408573 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-config\") pod \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.408636 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-config-data-custom\") pod \"76beac71-bf66-45ec-8a1f-5f6ed8122888\" (UID: \"76beac71-bf66-45ec-8a1f-5f6ed8122888\") " Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.408730 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5c39639-d2db-4abd-a474-d141a0d0af35-operator-scripts\") pod \"b5c39639-d2db-4abd-a474-d141a0d0af35\" (UID: \"b5c39639-d2db-4abd-a474-d141a0d0af35\") " Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.408749 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vwnd\" (UniqueName: \"kubernetes.io/projected/76beac71-bf66-45ec-8a1f-5f6ed8122888-kube-api-access-7vwnd\") pod \"76beac71-bf66-45ec-8a1f-5f6ed8122888\" (UID: \"76beac71-bf66-45ec-8a1f-5f6ed8122888\") " Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.408773 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-dns-svc\") pod \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.408817 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5bxr\" (UniqueName: \"kubernetes.io/projected/454beaa2-a30a-4b5f-bb64-95eafaa20360-kube-api-access-f5bxr\") pod \"454beaa2-a30a-4b5f-bb64-95eafaa20360\" (UID: \"454beaa2-a30a-4b5f-bb64-95eafaa20360\") " Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.408833 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-config-data\") pod \"76beac71-bf66-45ec-8a1f-5f6ed8122888\" (UID: \"76beac71-bf66-45ec-8a1f-5f6ed8122888\") " Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.408880 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mglgp\" (UniqueName: \"kubernetes.io/projected/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-kube-api-access-mglgp\") pod 
\"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.408927 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-combined-ca-bundle\") pod \"76beac71-bf66-45ec-8a1f-5f6ed8122888\" (UID: \"76beac71-bf66-45ec-8a1f-5f6ed8122888\") " Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.408961 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-ovsdbserver-nb\") pod \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.408992 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/454beaa2-a30a-4b5f-bb64-95eafaa20360-operator-scripts\") pod \"454beaa2-a30a-4b5f-bb64-95eafaa20360\" (UID: \"454beaa2-a30a-4b5f-bb64-95eafaa20360\") " Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.409025 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-ovsdbserver-sb\") pod \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.409065 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-dns-swift-storage-0\") pod \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\" (UID: \"ae6daf4a-5550-44e9-a0bd-11bc6527ad5d\") " Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.409081 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6mfp2\" (UniqueName: \"kubernetes.io/projected/b5c39639-d2db-4abd-a474-d141a0d0af35-kube-api-access-6mfp2\") pod \"b5c39639-d2db-4abd-a474-d141a0d0af35\" (UID: \"b5c39639-d2db-4abd-a474-d141a0d0af35\") " Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.418051 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/454beaa2-a30a-4b5f-bb64-95eafaa20360-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "454beaa2-a30a-4b5f-bb64-95eafaa20360" (UID: "454beaa2-a30a-4b5f-bb64-95eafaa20360"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.420943 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5c39639-d2db-4abd-a474-d141a0d0af35-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5c39639-d2db-4abd-a474-d141a0d0af35" (UID: "b5c39639-d2db-4abd-a474-d141a0d0af35"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.424847 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5c39639-d2db-4abd-a474-d141a0d0af35-kube-api-access-6mfp2" (OuterVolumeSpecName: "kube-api-access-6mfp2") pod "b5c39639-d2db-4abd-a474-d141a0d0af35" (UID: "b5c39639-d2db-4abd-a474-d141a0d0af35"). InnerVolumeSpecName "kube-api-access-6mfp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.428083 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "76beac71-bf66-45ec-8a1f-5f6ed8122888" (UID: "76beac71-bf66-45ec-8a1f-5f6ed8122888"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.428451 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-kube-api-access-mglgp" (OuterVolumeSpecName: "kube-api-access-mglgp") pod "ae6daf4a-5550-44e9-a0bd-11bc6527ad5d" (UID: "ae6daf4a-5550-44e9-a0bd-11bc6527ad5d"). InnerVolumeSpecName "kube-api-access-mglgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.433021 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/454beaa2-a30a-4b5f-bb64-95eafaa20360-kube-api-access-f5bxr" (OuterVolumeSpecName: "kube-api-access-f5bxr") pod "454beaa2-a30a-4b5f-bb64-95eafaa20360" (UID: "454beaa2-a30a-4b5f-bb64-95eafaa20360"). InnerVolumeSpecName "kube-api-access-f5bxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.434799 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76beac71-bf66-45ec-8a1f-5f6ed8122888-kube-api-access-7vwnd" (OuterVolumeSpecName: "kube-api-access-7vwnd") pod "76beac71-bf66-45ec-8a1f-5f6ed8122888" (UID: "76beac71-bf66-45ec-8a1f-5f6ed8122888"). InnerVolumeSpecName "kube-api-access-7vwnd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.446462 4775 scope.go:117] "RemoveContainer" containerID="83b1d8fcd134c9654aa43e1ac362b72dd4ae4d8f06f5703b7cc23d95b9c668a9" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.493272 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76beac71-bf66-45ec-8a1f-5f6ed8122888" (UID: "76beac71-bf66-45ec-8a1f-5f6ed8122888"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.511751 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mfp2\" (UniqueName: \"kubernetes.io/projected/b5c39639-d2db-4abd-a474-d141a0d0af35-kube-api-access-6mfp2\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.511774 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.511786 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5c39639-d2db-4abd-a474-d141a0d0af35-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.511796 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vwnd\" (UniqueName: \"kubernetes.io/projected/76beac71-bf66-45ec-8a1f-5f6ed8122888-kube-api-access-7vwnd\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.511808 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5bxr\" (UniqueName: 
\"kubernetes.io/projected/454beaa2-a30a-4b5f-bb64-95eafaa20360-kube-api-access-f5bxr\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.511818 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mglgp\" (UniqueName: \"kubernetes.io/projected/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-kube-api-access-mglgp\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.511826 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.511834 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/454beaa2-a30a-4b5f-bb64-95eafaa20360-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.541656 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ae6daf4a-5550-44e9-a0bd-11bc6527ad5d" (UID: "ae6daf4a-5550-44e9-a0bd-11bc6527ad5d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.547973 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ae6daf4a-5550-44e9-a0bd-11bc6527ad5d" (UID: "ae6daf4a-5550-44e9-a0bd-11bc6527ad5d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.557501 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ae6daf4a-5550-44e9-a0bd-11bc6527ad5d" (UID: "ae6daf4a-5550-44e9-a0bd-11bc6527ad5d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.569533 4775 scope.go:117] "RemoveContainer" containerID="93d9517139b3988374248ab712d91f2f749fed150a9bd8239d3c3e9608d0978a" Dec 16 15:16:40 crc kubenswrapper[4775]: E1216 15:16:40.570991 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93d9517139b3988374248ab712d91f2f749fed150a9bd8239d3c3e9608d0978a\": container with ID starting with 93d9517139b3988374248ab712d91f2f749fed150a9bd8239d3c3e9608d0978a not found: ID does not exist" containerID="93d9517139b3988374248ab712d91f2f749fed150a9bd8239d3c3e9608d0978a" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.571021 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93d9517139b3988374248ab712d91f2f749fed150a9bd8239d3c3e9608d0978a"} err="failed to get container status \"93d9517139b3988374248ab712d91f2f749fed150a9bd8239d3c3e9608d0978a\": rpc error: code = NotFound desc = could not find container \"93d9517139b3988374248ab712d91f2f749fed150a9bd8239d3c3e9608d0978a\": container with ID starting with 93d9517139b3988374248ab712d91f2f749fed150a9bd8239d3c3e9608d0978a not found: ID does not exist" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.571043 4775 scope.go:117] "RemoveContainer" containerID="83b1d8fcd134c9654aa43e1ac362b72dd4ae4d8f06f5703b7cc23d95b9c668a9" Dec 16 15:16:40 crc kubenswrapper[4775]: E1216 15:16:40.571748 4775 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83b1d8fcd134c9654aa43e1ac362b72dd4ae4d8f06f5703b7cc23d95b9c668a9\": container with ID starting with 83b1d8fcd134c9654aa43e1ac362b72dd4ae4d8f06f5703b7cc23d95b9c668a9 not found: ID does not exist" containerID="83b1d8fcd134c9654aa43e1ac362b72dd4ae4d8f06f5703b7cc23d95b9c668a9" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.571791 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b1d8fcd134c9654aa43e1ac362b72dd4ae4d8f06f5703b7cc23d95b9c668a9"} err="failed to get container status \"83b1d8fcd134c9654aa43e1ac362b72dd4ae4d8f06f5703b7cc23d95b9c668a9\": rpc error: code = NotFound desc = could not find container \"83b1d8fcd134c9654aa43e1ac362b72dd4ae4d8f06f5703b7cc23d95b9c668a9\": container with ID starting with 83b1d8fcd134c9654aa43e1ac362b72dd4ae4d8f06f5703b7cc23d95b9c668a9 not found: ID does not exist" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.571804 4775 scope.go:117] "RemoveContainer" containerID="64ee8494cd8b2f7293109016ab3329239eba909a6f402b28d52f228ac6865b5a" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.574457 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ae6daf4a-5550-44e9-a0bd-11bc6527ad5d" (UID: "ae6daf4a-5550-44e9-a0bd-11bc6527ad5d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.588694 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-config" (OuterVolumeSpecName: "config") pod "ae6daf4a-5550-44e9-a0bd-11bc6527ad5d" (UID: "ae6daf4a-5550-44e9-a0bd-11bc6527ad5d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.602272 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-config-data" (OuterVolumeSpecName: "config-data") pod "76beac71-bf66-45ec-8a1f-5f6ed8122888" (UID: "76beac71-bf66-45ec-8a1f-5f6ed8122888"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.611585 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.614713 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.614738 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76beac71-bf66-45ec-8a1f-5f6ed8122888-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.614755 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.614765 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.614796 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc 
kubenswrapper[4775]: I1216 15:16:40.614809 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4775]: I1216 15:16:40.938316 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-78c456ddf7-dbpj2" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.026755 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-config-data\") pod \"e42b8d55-b8c7-4982-9cef-714706199d4a\" (UID: \"e42b8d55-b8c7-4982-9cef-714706199d4a\") " Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.026828 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwj5k\" (UniqueName: \"kubernetes.io/projected/e42b8d55-b8c7-4982-9cef-714706199d4a-kube-api-access-dwj5k\") pod \"e42b8d55-b8c7-4982-9cef-714706199d4a\" (UID: \"e42b8d55-b8c7-4982-9cef-714706199d4a\") " Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.026867 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-combined-ca-bundle\") pod \"e42b8d55-b8c7-4982-9cef-714706199d4a\" (UID: \"e42b8d55-b8c7-4982-9cef-714706199d4a\") " Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.027090 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-config-data-custom\") pod \"e42b8d55-b8c7-4982-9cef-714706199d4a\" (UID: \"e42b8d55-b8c7-4982-9cef-714706199d4a\") " Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.058531 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e42b8d55-b8c7-4982-9cef-714706199d4a-kube-api-access-dwj5k" (OuterVolumeSpecName: "kube-api-access-dwj5k") pod "e42b8d55-b8c7-4982-9cef-714706199d4a" (UID: "e42b8d55-b8c7-4982-9cef-714706199d4a"). InnerVolumeSpecName "kube-api-access-dwj5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.073035 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e42b8d55-b8c7-4982-9cef-714706199d4a" (UID: "e42b8d55-b8c7-4982-9cef-714706199d4a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.088468 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e42b8d55-b8c7-4982-9cef-714706199d4a" (UID: "e42b8d55-b8c7-4982-9cef-714706199d4a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.103041 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.130984 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.131241 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwj5k\" (UniqueName: \"kubernetes.io/projected/e42b8d55-b8c7-4982-9cef-714706199d4a-kube-api-access-dwj5k\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.131315 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.150992 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.155025 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.155410 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-config-data" (OuterVolumeSpecName: "config-data") pod "e42b8d55-b8c7-4982-9cef-714706199d4a" (UID: "e42b8d55-b8c7-4982-9cef-714706199d4a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.172348 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.172404 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.235165 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42b8d55-b8c7-4982-9cef-714706199d4a-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.293230 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8vlhw" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.297638 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-41a9-account-create-update-fnj2g" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.327227 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-9435-account-create-update-td6k7" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.336407 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndk5g\" (UniqueName: \"kubernetes.io/projected/382550fb-c9fc-4100-a196-8ab11975d0ad-kube-api-access-ndk5g\") pod \"382550fb-c9fc-4100-a196-8ab11975d0ad\" (UID: \"382550fb-c9fc-4100-a196-8ab11975d0ad\") " Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.336506 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx94s\" (UniqueName: \"kubernetes.io/projected/a87cdc28-aa31-4446-a1a6-e0904f9daa62-kube-api-access-gx94s\") pod \"a87cdc28-aa31-4446-a1a6-e0904f9daa62\" (UID: \"a87cdc28-aa31-4446-a1a6-e0904f9daa62\") " Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.336539 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a87cdc28-aa31-4446-a1a6-e0904f9daa62-operator-scripts\") pod \"a87cdc28-aa31-4446-a1a6-e0904f9daa62\" (UID: \"a87cdc28-aa31-4446-a1a6-e0904f9daa62\") " Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.336616 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/382550fb-c9fc-4100-a196-8ab11975d0ad-operator-scripts\") pod \"382550fb-c9fc-4100-a196-8ab11975d0ad\" (UID: \"382550fb-c9fc-4100-a196-8ab11975d0ad\") " Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.337420 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/382550fb-c9fc-4100-a196-8ab11975d0ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "382550fb-c9fc-4100-a196-8ab11975d0ad" (UID: "382550fb-c9fc-4100-a196-8ab11975d0ad"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.340149 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a87cdc28-aa31-4446-a1a6-e0904f9daa62-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a87cdc28-aa31-4446-a1a6-e0904f9daa62" (UID: "a87cdc28-aa31-4446-a1a6-e0904f9daa62"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.342160 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a87cdc28-aa31-4446-a1a6-e0904f9daa62-kube-api-access-gx94s" (OuterVolumeSpecName: "kube-api-access-gx94s") pod "a87cdc28-aa31-4446-a1a6-e0904f9daa62" (UID: "a87cdc28-aa31-4446-a1a6-e0904f9daa62"). InnerVolumeSpecName "kube-api-access-gx94s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.348498 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/382550fb-c9fc-4100-a196-8ab11975d0ad-kube-api-access-ndk5g" (OuterVolumeSpecName: "kube-api-access-ndk5g") pod "382550fb-c9fc-4100-a196-8ab11975d0ad" (UID: "382550fb-c9fc-4100-a196-8ab11975d0ad"). InnerVolumeSpecName "kube-api-access-ndk5g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.437967 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c18fb7c5-60d7-497d-8031-4d3c073104a6-operator-scripts\") pod \"c18fb7c5-60d7-497d-8031-4d3c073104a6\" (UID: \"c18fb7c5-60d7-497d-8031-4d3c073104a6\") " Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.438026 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq74f\" (UniqueName: \"kubernetes.io/projected/c18fb7c5-60d7-497d-8031-4d3c073104a6-kube-api-access-fq74f\") pod \"c18fb7c5-60d7-497d-8031-4d3c073104a6\" (UID: \"c18fb7c5-60d7-497d-8031-4d3c073104a6\") " Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.438404 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c18fb7c5-60d7-497d-8031-4d3c073104a6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c18fb7c5-60d7-497d-8031-4d3c073104a6" (UID: "c18fb7c5-60d7-497d-8031-4d3c073104a6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.438829 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndk5g\" (UniqueName: \"kubernetes.io/projected/382550fb-c9fc-4100-a196-8ab11975d0ad-kube-api-access-ndk5g\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.438847 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx94s\" (UniqueName: \"kubernetes.io/projected/a87cdc28-aa31-4446-a1a6-e0904f9daa62-kube-api-access-gx94s\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.438857 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a87cdc28-aa31-4446-a1a6-e0904f9daa62-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.438869 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/382550fb-c9fc-4100-a196-8ab11975d0ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.438876 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c18fb7c5-60d7-497d-8031-4d3c073104a6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.442307 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c18fb7c5-60d7-497d-8031-4d3c073104a6-kube-api-access-fq74f" (OuterVolumeSpecName: "kube-api-access-fq74f") pod "c18fb7c5-60d7-497d-8031-4d3c073104a6" (UID: "c18fb7c5-60d7-497d-8031-4d3c073104a6"). InnerVolumeSpecName "kube-api-access-fq74f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.491822 4775 scope.go:117] "RemoveContainer" containerID="c00d1eca1370ec628d48f9aa288a5ce69131eb1bd061b18df84d2c6371b539b1" Dec 16 15:16:41 crc kubenswrapper[4775]: E1216 15:16:41.492145 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-76487b4cc4-bpbgr_openstack(2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d)\"" pod="openstack/heat-api-76487b4cc4-bpbgr" podUID="2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.492679 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"521e4d08-bfb5-4043-bc0f-7515dbeb467f","Type":"ContainerStarted","Data":"71fbabe8ccd1478fed14c176b3d9709463efcc8fb4a65c26d0d65b972cc8b93e"} Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.495064 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.508294 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-41a9-account-create-update-fnj2g" event={"ID":"a87cdc28-aa31-4446-a1a6-e0904f9daa62","Type":"ContainerDied","Data":"5f79622324d27df05cd2f7e6497a4c0e36295ff0ad0ad7dac5ab6430bf8465bb"} Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.508387 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f79622324d27df05cd2f7e6497a4c0e36295ff0ad0ad7dac5ab6430bf8465bb" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.508446 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-41a9-account-create-update-fnj2g" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.511806 4775 scope.go:117] "RemoveContainer" containerID="5ab773b0a14bc1a03f2a838c08d2cdd2505d2f815f150d57b5edea996c70cf7a" Dec 16 15:16:41 crc kubenswrapper[4775]: E1216 15:16:41.512154 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-59c7c5dfbf-8495r_openstack(402ffd59-e84f-4e09-9d8d-d89c6c788547)\"" pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" podUID="402ffd59-e84f-4e09-9d8d-d89c6c788547" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.512197 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7732fee4-0518-41db-be31-b9c7ae4aca6b","Type":"ContainerStarted","Data":"53bee0e56cf397c523e9626f36a642460459ec3f854fc4729cbfe5678a0876ee"} Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.514766 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-78c456ddf7-dbpj2" event={"ID":"e42b8d55-b8c7-4982-9cef-714706199d4a","Type":"ContainerDied","Data":"c2c049ee0fafef25606f61695fc4588b5ca2c4e8c6a25b19b3dc7ee546c2a37d"} Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.514806 4775 scope.go:117] "RemoveContainer" containerID="6b76060f99541f01e9151fd0b91e13f69bb8e5c7d73c1928f85fd943513c9c2c" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.516329 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-78c456ddf7-dbpj2" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.524578 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0e61daa0-8bea-4632-8936-5fb68d555ab1","Type":"ContainerStarted","Data":"e99c5f68ee7b79e0d9278b5aca335bb6377e851d1497e53d26a419d2d7e4a584"} Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.531571 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.531552677 podStartE2EDuration="6.531552677s" podCreationTimestamp="2025-12-16 15:16:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:41.520872829 +0000 UTC m=+1326.471951752" watchObservedRunningTime="2025-12-16 15:16:41.531552677 +0000 UTC m=+1326.482631600" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.539068 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3399b2a-e8f1-442d-be74-160e5524608b","Type":"ContainerStarted","Data":"75a1e84b4af0d73dbe983571a16ae08b388d40d8c9dfee0dfc96d8c0c81f508d"} Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.540519 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq74f\" (UniqueName: \"kubernetes.io/projected/c18fb7c5-60d7-497d-8031-4d3c073104a6-kube-api-access-fq74f\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.552331 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-9435-account-create-update-td6k7" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.552385 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9435-account-create-update-td6k7" event={"ID":"c18fb7c5-60d7-497d-8031-4d3c073104a6","Type":"ContainerDied","Data":"7865649c4b20ae8bea202168daa6882481e37dbbb48e96646f8d6a5da562f156"} Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.552442 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7865649c4b20ae8bea202168daa6882481e37dbbb48e96646f8d6a5da562f156" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.560219 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-2xd25" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.564694 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-65646f4f55-j22ds" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.565042 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8vlhw" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.567904 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8vlhw" event={"ID":"382550fb-c9fc-4100-a196-8ab11975d0ad","Type":"ContainerDied","Data":"7f2820803b30eb5fcb93179ee23c8d3177640fe2f3c423f4324a83d5acb92dad"} Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.567960 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f2820803b30eb5fcb93179ee23c8d3177640fe2f3c423f4324a83d5acb92dad" Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.853149 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-78c456ddf7-dbpj2"] Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.872603 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-78c456ddf7-dbpj2"] Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.880643 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-65646f4f55-j22ds"] Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.895091 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-65646f4f55-j22ds"] Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.906850 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-2xd25"] Dec 16 15:16:41 crc kubenswrapper[4775]: I1216 15:16:41.920853 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-2xd25"] Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.580006 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0e61daa0-8bea-4632-8936-5fb68d555ab1","Type":"ContainerStarted","Data":"d2203263510705e2ccf026bc98a8c753e6075c824a91099c00607cca359f4aa8"} Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.580433 4775 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0e61daa0-8bea-4632-8936-5fb68d555ab1","Type":"ContainerStarted","Data":"4509dbcc71934421a1c923c92502bf6005298abee453879f38ddc52e84cb8cf8"} Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.581581 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7732fee4-0518-41db-be31-b9c7ae4aca6b","Type":"ContainerStarted","Data":"27b6995602678ec62db96b62ab77850dfadfdbc771dc0fb0e4c3fa977edd92c8"} Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.582507 4775 scope.go:117] "RemoveContainer" containerID="5ab773b0a14bc1a03f2a838c08d2cdd2505d2f815f150d57b5edea996c70cf7a" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.582557 4775 scope.go:117] "RemoveContainer" containerID="c00d1eca1370ec628d48f9aa288a5ce69131eb1bd061b18df84d2c6371b539b1" Dec 16 15:16:42 crc kubenswrapper[4775]: E1216 15:16:42.582718 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-59c7c5dfbf-8495r_openstack(402ffd59-e84f-4e09-9d8d-d89c6c788547)\"" pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" podUID="402ffd59-e84f-4e09-9d8d-d89c6c788547" Dec 16 15:16:42 crc kubenswrapper[4775]: E1216 15:16:42.582764 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-76487b4cc4-bpbgr_openstack(2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d)\"" pod="openstack/heat-api-76487b4cc4-bpbgr" podUID="2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.607157 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.607089479 podStartE2EDuration="4.607089479s" 
podCreationTimestamp="2025-12-16 15:16:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:42.6045764 +0000 UTC m=+1327.555655323" watchObservedRunningTime="2025-12-16 15:16:42.607089479 +0000 UTC m=+1327.558168422" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.781655 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6sj6z"] Dec 16 15:16:42 crc kubenswrapper[4775]: E1216 15:16:42.782059 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae6daf4a-5550-44e9-a0bd-11bc6527ad5d" containerName="dnsmasq-dns" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.782074 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae6daf4a-5550-44e9-a0bd-11bc6527ad5d" containerName="dnsmasq-dns" Dec 16 15:16:42 crc kubenswrapper[4775]: E1216 15:16:42.782088 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382550fb-c9fc-4100-a196-8ab11975d0ad" containerName="mariadb-database-create" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.782095 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="382550fb-c9fc-4100-a196-8ab11975d0ad" containerName="mariadb-database-create" Dec 16 15:16:42 crc kubenswrapper[4775]: E1216 15:16:42.782106 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18fb7c5-60d7-497d-8031-4d3c073104a6" containerName="mariadb-account-create-update" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.782114 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18fb7c5-60d7-497d-8031-4d3c073104a6" containerName="mariadb-account-create-update" Dec 16 15:16:42 crc kubenswrapper[4775]: E1216 15:16:42.782127 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="454beaa2-a30a-4b5f-bb64-95eafaa20360" containerName="mariadb-database-create" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.782133 4775 
state_mem.go:107] "Deleted CPUSet assignment" podUID="454beaa2-a30a-4b5f-bb64-95eafaa20360" containerName="mariadb-database-create" Dec 16 15:16:42 crc kubenswrapper[4775]: E1216 15:16:42.782142 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76beac71-bf66-45ec-8a1f-5f6ed8122888" containerName="heat-cfnapi" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.782147 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="76beac71-bf66-45ec-8a1f-5f6ed8122888" containerName="heat-cfnapi" Dec 16 15:16:42 crc kubenswrapper[4775]: E1216 15:16:42.782162 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae6daf4a-5550-44e9-a0bd-11bc6527ad5d" containerName="init" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.782168 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae6daf4a-5550-44e9-a0bd-11bc6527ad5d" containerName="init" Dec 16 15:16:42 crc kubenswrapper[4775]: E1216 15:16:42.782176 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c39639-d2db-4abd-a474-d141a0d0af35" containerName="mariadb-account-create-update" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.782182 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c39639-d2db-4abd-a474-d141a0d0af35" containerName="mariadb-account-create-update" Dec 16 15:16:42 crc kubenswrapper[4775]: E1216 15:16:42.782200 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42b8d55-b8c7-4982-9cef-714706199d4a" containerName="heat-api" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.782207 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42b8d55-b8c7-4982-9cef-714706199d4a" containerName="heat-api" Dec 16 15:16:42 crc kubenswrapper[4775]: E1216 15:16:42.782221 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a87cdc28-aa31-4446-a1a6-e0904f9daa62" containerName="mariadb-account-create-update" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.782227 4775 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a87cdc28-aa31-4446-a1a6-e0904f9daa62" containerName="mariadb-account-create-update" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.782386 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42b8d55-b8c7-4982-9cef-714706199d4a" containerName="heat-api" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.782394 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="76beac71-bf66-45ec-8a1f-5f6ed8122888" containerName="heat-cfnapi" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.782404 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a87cdc28-aa31-4446-a1a6-e0904f9daa62" containerName="mariadb-account-create-update" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.782419 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5c39639-d2db-4abd-a474-d141a0d0af35" containerName="mariadb-account-create-update" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.782431 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="382550fb-c9fc-4100-a196-8ab11975d0ad" containerName="mariadb-database-create" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.782440 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="454beaa2-a30a-4b5f-bb64-95eafaa20360" containerName="mariadb-database-create" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.782447 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae6daf4a-5550-44e9-a0bd-11bc6527ad5d" containerName="dnsmasq-dns" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.782455 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c18fb7c5-60d7-497d-8031-4d3c073104a6" containerName="mariadb-account-create-update" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.783085 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6sj6z" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.785740 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.797089 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6sj6z"] Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.804409 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.813026 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7xsqc" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.872344 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-config-data\") pod \"nova-cell0-conductor-db-sync-6sj6z\" (UID: \"862dcea5-6162-4150-84d2-69baeced1f01\") " pod="openstack/nova-cell0-conductor-db-sync-6sj6z" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.872421 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pfdb\" (UniqueName: \"kubernetes.io/projected/862dcea5-6162-4150-84d2-69baeced1f01-kube-api-access-9pfdb\") pod \"nova-cell0-conductor-db-sync-6sj6z\" (UID: \"862dcea5-6162-4150-84d2-69baeced1f01\") " pod="openstack/nova-cell0-conductor-db-sync-6sj6z" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.872445 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-scripts\") pod \"nova-cell0-conductor-db-sync-6sj6z\" (UID: \"862dcea5-6162-4150-84d2-69baeced1f01\") " 
pod="openstack/nova-cell0-conductor-db-sync-6sj6z" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.872534 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6sj6z\" (UID: \"862dcea5-6162-4150-84d2-69baeced1f01\") " pod="openstack/nova-cell0-conductor-db-sync-6sj6z" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.973584 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pfdb\" (UniqueName: \"kubernetes.io/projected/862dcea5-6162-4150-84d2-69baeced1f01-kube-api-access-9pfdb\") pod \"nova-cell0-conductor-db-sync-6sj6z\" (UID: \"862dcea5-6162-4150-84d2-69baeced1f01\") " pod="openstack/nova-cell0-conductor-db-sync-6sj6z" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.973633 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-scripts\") pod \"nova-cell0-conductor-db-sync-6sj6z\" (UID: \"862dcea5-6162-4150-84d2-69baeced1f01\") " pod="openstack/nova-cell0-conductor-db-sync-6sj6z" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.973728 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6sj6z\" (UID: \"862dcea5-6162-4150-84d2-69baeced1f01\") " pod="openstack/nova-cell0-conductor-db-sync-6sj6z" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.973765 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-config-data\") pod \"nova-cell0-conductor-db-sync-6sj6z\" (UID: 
\"862dcea5-6162-4150-84d2-69baeced1f01\") " pod="openstack/nova-cell0-conductor-db-sync-6sj6z" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.979454 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6sj6z\" (UID: \"862dcea5-6162-4150-84d2-69baeced1f01\") " pod="openstack/nova-cell0-conductor-db-sync-6sj6z" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.979653 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-config-data\") pod \"nova-cell0-conductor-db-sync-6sj6z\" (UID: \"862dcea5-6162-4150-84d2-69baeced1f01\") " pod="openstack/nova-cell0-conductor-db-sync-6sj6z" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.979849 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-scripts\") pod \"nova-cell0-conductor-db-sync-6sj6z\" (UID: \"862dcea5-6162-4150-84d2-69baeced1f01\") " pod="openstack/nova-cell0-conductor-db-sync-6sj6z" Dec 16 15:16:42 crc kubenswrapper[4775]: I1216 15:16:42.995228 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pfdb\" (UniqueName: \"kubernetes.io/projected/862dcea5-6162-4150-84d2-69baeced1f01-kube-api-access-9pfdb\") pod \"nova-cell0-conductor-db-sync-6sj6z\" (UID: \"862dcea5-6162-4150-84d2-69baeced1f01\") " pod="openstack/nova-cell0-conductor-db-sync-6sj6z" Dec 16 15:16:43 crc kubenswrapper[4775]: I1216 15:16:43.123756 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6sj6z" Dec 16 15:16:43 crc kubenswrapper[4775]: I1216 15:16:43.352252 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76beac71-bf66-45ec-8a1f-5f6ed8122888" path="/var/lib/kubelet/pods/76beac71-bf66-45ec-8a1f-5f6ed8122888/volumes" Dec 16 15:16:43 crc kubenswrapper[4775]: I1216 15:16:43.353330 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae6daf4a-5550-44e9-a0bd-11bc6527ad5d" path="/var/lib/kubelet/pods/ae6daf4a-5550-44e9-a0bd-11bc6527ad5d/volumes" Dec 16 15:16:43 crc kubenswrapper[4775]: I1216 15:16:43.354253 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e42b8d55-b8c7-4982-9cef-714706199d4a" path="/var/lib/kubelet/pods/e42b8d55-b8c7-4982-9cef-714706199d4a/volumes" Dec 16 15:16:43 crc kubenswrapper[4775]: I1216 15:16:43.592720 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7732fee4-0518-41db-be31-b9c7ae4aca6b","Type":"ContainerStarted","Data":"75db5de456bdb602b118776ef841994f9eecaae68ea6d6ee986d7ced992f28da"} Dec 16 15:16:43 crc kubenswrapper[4775]: I1216 15:16:43.624540 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.624516838 podStartE2EDuration="5.624516838s" podCreationTimestamp="2025-12-16 15:16:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:43.618433836 +0000 UTC m=+1328.569512759" watchObservedRunningTime="2025-12-16 15:16:43.624516838 +0000 UTC m=+1328.575595761" Dec 16 15:16:43 crc kubenswrapper[4775]: I1216 15:16:43.698298 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6sj6z"] Dec 16 15:16:44 crc kubenswrapper[4775]: I1216 15:16:44.001730 4775 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:16:44 crc kubenswrapper[4775]: I1216 15:16:44.610357 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3399b2a-e8f1-442d-be74-160e5524608b","Type":"ContainerStarted","Data":"eec06c67b6250f4b3823ca4858457bdc670ea32a180ada881ca778a467e113e0"} Dec 16 15:16:44 crc kubenswrapper[4775]: I1216 15:16:44.610560 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3399b2a-e8f1-442d-be74-160e5524608b" containerName="ceilometer-central-agent" containerID="cri-o://edd8a8fdc8eb347f32ec09584f5f673b84dc2a4af5db33fd67bbb0286de9deb4" gracePeriod=30 Dec 16 15:16:44 crc kubenswrapper[4775]: I1216 15:16:44.610647 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3399b2a-e8f1-442d-be74-160e5524608b" containerName="proxy-httpd" containerID="cri-o://eec06c67b6250f4b3823ca4858457bdc670ea32a180ada881ca778a467e113e0" gracePeriod=30 Dec 16 15:16:44 crc kubenswrapper[4775]: I1216 15:16:44.610658 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 15:16:44 crc kubenswrapper[4775]: I1216 15:16:44.610702 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3399b2a-e8f1-442d-be74-160e5524608b" containerName="sg-core" containerID="cri-o://75a1e84b4af0d73dbe983571a16ae08b388d40d8c9dfee0dfc96d8c0c81f508d" gracePeriod=30 Dec 16 15:16:44 crc kubenswrapper[4775]: I1216 15:16:44.610743 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3399b2a-e8f1-442d-be74-160e5524608b" containerName="ceilometer-notification-agent" containerID="cri-o://93ac49b7de24427eb8d6e5965ec6d3292b7d7de1711710d6380d12dd02573032" gracePeriod=30 Dec 16 15:16:44 crc kubenswrapper[4775]: I1216 15:16:44.620407 
4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6sj6z" event={"ID":"862dcea5-6162-4150-84d2-69baeced1f01","Type":"ContainerStarted","Data":"5e7bb13b1f12b747faca739b443674db321306eb23c2a7cfd798f893be186c79"} Dec 16 15:16:44 crc kubenswrapper[4775]: I1216 15:16:44.651651 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.593008587 podStartE2EDuration="11.651629403s" podCreationTimestamp="2025-12-16 15:16:33 +0000 UTC" firstStartedPulling="2025-12-16 15:16:36.817259013 +0000 UTC m=+1321.768337936" lastFinishedPulling="2025-12-16 15:16:43.875879829 +0000 UTC m=+1328.826958752" observedRunningTime="2025-12-16 15:16:44.63629851 +0000 UTC m=+1329.587377443" watchObservedRunningTime="2025-12-16 15:16:44.651629403 +0000 UTC m=+1329.602708326" Dec 16 15:16:45 crc kubenswrapper[4775]: I1216 15:16:45.635609 4775 generic.go:334] "Generic (PLEG): container finished" podID="f3399b2a-e8f1-442d-be74-160e5524608b" containerID="eec06c67b6250f4b3823ca4858457bdc670ea32a180ada881ca778a467e113e0" exitCode=0 Dec 16 15:16:45 crc kubenswrapper[4775]: I1216 15:16:45.635922 4775 generic.go:334] "Generic (PLEG): container finished" podID="f3399b2a-e8f1-442d-be74-160e5524608b" containerID="75a1e84b4af0d73dbe983571a16ae08b388d40d8c9dfee0dfc96d8c0c81f508d" exitCode=2 Dec 16 15:16:45 crc kubenswrapper[4775]: I1216 15:16:45.635934 4775 generic.go:334] "Generic (PLEG): container finished" podID="f3399b2a-e8f1-442d-be74-160e5524608b" containerID="93ac49b7de24427eb8d6e5965ec6d3292b7d7de1711710d6380d12dd02573032" exitCode=0 Dec 16 15:16:45 crc kubenswrapper[4775]: I1216 15:16:45.635680 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3399b2a-e8f1-442d-be74-160e5524608b","Type":"ContainerDied","Data":"eec06c67b6250f4b3823ca4858457bdc670ea32a180ada881ca778a467e113e0"} Dec 16 15:16:45 crc kubenswrapper[4775]: I1216 15:16:45.635969 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3399b2a-e8f1-442d-be74-160e5524608b","Type":"ContainerDied","Data":"75a1e84b4af0d73dbe983571a16ae08b388d40d8c9dfee0dfc96d8c0c81f508d"} Dec 16 15:16:45 crc kubenswrapper[4775]: I1216 15:16:45.635985 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3399b2a-e8f1-442d-be74-160e5524608b","Type":"ContainerDied","Data":"93ac49b7de24427eb8d6e5965ec6d3292b7d7de1711710d6380d12dd02573032"} Dec 16 15:16:45 crc kubenswrapper[4775]: I1216 15:16:45.871540 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-69676fb7c9-tmm27" Dec 16 15:16:45 crc kubenswrapper[4775]: I1216 15:16:45.984143 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-59c7c5dfbf-8495r"] Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.212391 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-55844f6789-qwjbq" Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.327289 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-76487b4cc4-bpbgr"] Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.536117 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.650734 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" event={"ID":"402ffd59-e84f-4e09-9d8d-d89c6c788547","Type":"ContainerDied","Data":"d76fa5d418702bdf688b714aa03bb5b5117f367aef89cd62b9e660be65f98183"} Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.651116 4775 scope.go:117] "RemoveContainer" containerID="5ab773b0a14bc1a03f2a838c08d2cdd2505d2f815f150d57b5edea996c70cf7a" Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.651188 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-59c7c5dfbf-8495r" Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.660748 4775 generic.go:334] "Generic (PLEG): container finished" podID="f3399b2a-e8f1-442d-be74-160e5524608b" containerID="edd8a8fdc8eb347f32ec09584f5f673b84dc2a4af5db33fd67bbb0286de9deb4" exitCode=0 Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.660810 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3399b2a-e8f1-442d-be74-160e5524608b","Type":"ContainerDied","Data":"edd8a8fdc8eb347f32ec09584f5f673b84dc2a4af5db33fd67bbb0286de9deb4"} Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.672586 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njxrq\" (UniqueName: \"kubernetes.io/projected/402ffd59-e84f-4e09-9d8d-d89c6c788547-kube-api-access-njxrq\") pod \"402ffd59-e84f-4e09-9d8d-d89c6c788547\" (UID: \"402ffd59-e84f-4e09-9d8d-d89c6c788547\") " Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.672661 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-config-data\") pod \"402ffd59-e84f-4e09-9d8d-d89c6c788547\" (UID: 
\"402ffd59-e84f-4e09-9d8d-d89c6c788547\") " Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.672844 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-combined-ca-bundle\") pod \"402ffd59-e84f-4e09-9d8d-d89c6c788547\" (UID: \"402ffd59-e84f-4e09-9d8d-d89c6c788547\") " Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.673040 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-config-data-custom\") pod \"402ffd59-e84f-4e09-9d8d-d89c6c788547\" (UID: \"402ffd59-e84f-4e09-9d8d-d89c6c788547\") " Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.679263 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/402ffd59-e84f-4e09-9d8d-d89c6c788547-kube-api-access-njxrq" (OuterVolumeSpecName: "kube-api-access-njxrq") pod "402ffd59-e84f-4e09-9d8d-d89c6c788547" (UID: "402ffd59-e84f-4e09-9d8d-d89c6c788547"). InnerVolumeSpecName "kube-api-access-njxrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.679645 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "402ffd59-e84f-4e09-9d8d-d89c6c788547" (UID: "402ffd59-e84f-4e09-9d8d-d89c6c788547"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.718110 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "402ffd59-e84f-4e09-9d8d-d89c6c788547" (UID: "402ffd59-e84f-4e09-9d8d-d89c6c788547"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.753813 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-config-data" (OuterVolumeSpecName: "config-data") pod "402ffd59-e84f-4e09-9d8d-d89c6c788547" (UID: "402ffd59-e84f-4e09-9d8d-d89c6c788547"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.792140 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.792195 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njxrq\" (UniqueName: \"kubernetes.io/projected/402ffd59-e84f-4e09-9d8d-d89c6c788547-kube-api-access-njxrq\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.792217 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.792243 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402ffd59-e84f-4e09-9d8d-d89c6c788547-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 16 15:16:46 crc kubenswrapper[4775]: I1216 15:16:46.877041 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:46.997482 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-config-data\") pod \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\" (UID: \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\") " Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:46.997561 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-combined-ca-bundle\") pod \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\" (UID: \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\") " Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:46.997668 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cntm2\" (UniqueName: \"kubernetes.io/projected/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-kube-api-access-cntm2\") pod \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\" (UID: \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\") " Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:46.997879 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-config-data-custom\") pod \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\" (UID: \"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d\") " Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.005483 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d" (UID: "2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.012631 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-59c7c5dfbf-8495r"] Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.013131 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-kube-api-access-cntm2" (OuterVolumeSpecName: "kube-api-access-cntm2") pod "2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d" (UID: "2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d"). InnerVolumeSpecName "kube-api-access-cntm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.031241 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-59c7c5dfbf-8495r"] Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.041212 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d" (UID: "2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.078055 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.082745 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-config-data" (OuterVolumeSpecName: "config-data") pod "2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d" (UID: "2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.108169 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.108204 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.108218 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.108232 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cntm2\" (UniqueName: \"kubernetes.io/projected/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d-kube-api-access-cntm2\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.209762 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-scripts\") pod \"f3399b2a-e8f1-442d-be74-160e5524608b\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.209865 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-sg-core-conf-yaml\") pod \"f3399b2a-e8f1-442d-be74-160e5524608b\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.209900 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-config-data\") pod \"f3399b2a-e8f1-442d-be74-160e5524608b\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.209923 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct46f\" (UniqueName: \"kubernetes.io/projected/f3399b2a-e8f1-442d-be74-160e5524608b-kube-api-access-ct46f\") pod \"f3399b2a-e8f1-442d-be74-160e5524608b\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.209949 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3399b2a-e8f1-442d-be74-160e5524608b-log-httpd\") pod \"f3399b2a-e8f1-442d-be74-160e5524608b\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.210157 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3399b2a-e8f1-442d-be74-160e5524608b-run-httpd\") pod \"f3399b2a-e8f1-442d-be74-160e5524608b\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.210277 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-combined-ca-bundle\") pod \"f3399b2a-e8f1-442d-be74-160e5524608b\" (UID: \"f3399b2a-e8f1-442d-be74-160e5524608b\") " Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.210941 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3399b2a-e8f1-442d-be74-160e5524608b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f3399b2a-e8f1-442d-be74-160e5524608b" (UID: "f3399b2a-e8f1-442d-be74-160e5524608b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.211712 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3399b2a-e8f1-442d-be74-160e5524608b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f3399b2a-e8f1-442d-be74-160e5524608b" (UID: "f3399b2a-e8f1-442d-be74-160e5524608b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.214979 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3399b2a-e8f1-442d-be74-160e5524608b-kube-api-access-ct46f" (OuterVolumeSpecName: "kube-api-access-ct46f") pod "f3399b2a-e8f1-442d-be74-160e5524608b" (UID: "f3399b2a-e8f1-442d-be74-160e5524608b"). InnerVolumeSpecName "kube-api-access-ct46f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.215924 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-scripts" (OuterVolumeSpecName: "scripts") pod "f3399b2a-e8f1-442d-be74-160e5524608b" (UID: "f3399b2a-e8f1-442d-be74-160e5524608b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.243597 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f3399b2a-e8f1-442d-be74-160e5524608b" (UID: "f3399b2a-e8f1-442d-be74-160e5524608b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.313041 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3399b2a-e8f1-442d-be74-160e5524608b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.313072 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.313081 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.313090 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct46f\" (UniqueName: \"kubernetes.io/projected/f3399b2a-e8f1-442d-be74-160e5524608b-kube-api-access-ct46f\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.313099 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3399b2a-e8f1-442d-be74-160e5524608b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.320801 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3399b2a-e8f1-442d-be74-160e5524608b" (UID: "f3399b2a-e8f1-442d-be74-160e5524608b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.349960 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="402ffd59-e84f-4e09-9d8d-d89c6c788547" path="/var/lib/kubelet/pods/402ffd59-e84f-4e09-9d8d-d89c6c788547/volumes" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.356833 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-config-data" (OuterVolumeSpecName: "config-data") pod "f3399b2a-e8f1-442d-be74-160e5524608b" (UID: "f3399b2a-e8f1-442d-be74-160e5524608b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.418488 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.418524 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3399b2a-e8f1-442d-be74-160e5524608b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.679208 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-76487b4cc4-bpbgr" event={"ID":"2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d","Type":"ContainerDied","Data":"48fcc6b7ca8bdf9ef650e2d3c90a224483e8293a1b2d585269f3afe1f0bcb240"} Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.679270 4775 scope.go:117] "RemoveContainer" containerID="c00d1eca1370ec628d48f9aa288a5ce69131eb1bd061b18df84d2c6371b539b1" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.679549 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-76487b4cc4-bpbgr" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.697548 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3399b2a-e8f1-442d-be74-160e5524608b","Type":"ContainerDied","Data":"2c4e454f47a2dd2a0188abb4d911f6e5b2a23a7f5320851325bdf771093adf80"} Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.697646 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.717586 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-76487b4cc4-bpbgr"] Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.725009 4775 scope.go:117] "RemoveContainer" containerID="eec06c67b6250f4b3823ca4858457bdc670ea32a180ada881ca778a467e113e0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.743015 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-76487b4cc4-bpbgr"] Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.772411 4775 scope.go:117] "RemoveContainer" containerID="75a1e84b4af0d73dbe983571a16ae08b388d40d8c9dfee0dfc96d8c0c81f508d" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.776319 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.791960 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.797237 4775 scope.go:117] "RemoveContainer" containerID="93ac49b7de24427eb8d6e5965ec6d3292b7d7de1711710d6380d12dd02573032" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.804992 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:47 crc kubenswrapper[4775]: E1216 15:16:47.805398 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d" containerName="heat-api" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.805415 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d" containerName="heat-api" Dec 16 15:16:47 crc kubenswrapper[4775]: E1216 15:16:47.805433 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3399b2a-e8f1-442d-be74-160e5524608b" containerName="proxy-httpd" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.805439 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3399b2a-e8f1-442d-be74-160e5524608b" containerName="proxy-httpd" Dec 16 15:16:47 crc kubenswrapper[4775]: E1216 15:16:47.805451 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3399b2a-e8f1-442d-be74-160e5524608b" containerName="sg-core" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.805458 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3399b2a-e8f1-442d-be74-160e5524608b" containerName="sg-core" Dec 16 15:16:47 crc kubenswrapper[4775]: E1216 15:16:47.805471 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3399b2a-e8f1-442d-be74-160e5524608b" containerName="ceilometer-notification-agent" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.805478 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3399b2a-e8f1-442d-be74-160e5524608b" containerName="ceilometer-notification-agent" Dec 16 15:16:47 crc kubenswrapper[4775]: E1216 15:16:47.805505 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3399b2a-e8f1-442d-be74-160e5524608b" containerName="ceilometer-central-agent" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.805512 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3399b2a-e8f1-442d-be74-160e5524608b" containerName="ceilometer-central-agent" Dec 16 15:16:47 crc kubenswrapper[4775]: E1216 15:16:47.805528 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="402ffd59-e84f-4e09-9d8d-d89c6c788547" containerName="heat-cfnapi" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.805534 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="402ffd59-e84f-4e09-9d8d-d89c6c788547" containerName="heat-cfnapi" Dec 16 15:16:47 crc kubenswrapper[4775]: E1216 15:16:47.805541 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402ffd59-e84f-4e09-9d8d-d89c6c788547" containerName="heat-cfnapi" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.805547 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="402ffd59-e84f-4e09-9d8d-d89c6c788547" containerName="heat-cfnapi" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.805708 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d" containerName="heat-api" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.805727 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="402ffd59-e84f-4e09-9d8d-d89c6c788547" containerName="heat-cfnapi" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.805736 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3399b2a-e8f1-442d-be74-160e5524608b" containerName="ceilometer-central-agent" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.805748 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3399b2a-e8f1-442d-be74-160e5524608b" containerName="proxy-httpd" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.805760 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3399b2a-e8f1-442d-be74-160e5524608b" containerName="sg-core" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.805766 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="402ffd59-e84f-4e09-9d8d-d89c6c788547" containerName="heat-cfnapi" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.805773 4775 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f3399b2a-e8f1-442d-be74-160e5524608b" containerName="ceilometer-notification-agent" Dec 16 15:16:47 crc kubenswrapper[4775]: E1216 15:16:47.806144 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d" containerName="heat-api" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.806170 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d" containerName="heat-api" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.806340 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d" containerName="heat-api" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.807597 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.813463 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.813590 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.827704 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.831543 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.831632 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be773756-aa1d-455c-9eac-a1e053636353-run-httpd\") pod \"ceilometer-0\" (UID: 
\"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.831668 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.831859 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-scripts\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.832037 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be773756-aa1d-455c-9eac-a1e053636353-log-httpd\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.832165 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trdk6\" (UniqueName: \"kubernetes.io/projected/be773756-aa1d-455c-9eac-a1e053636353-kube-api-access-trdk6\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.832249 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-config-data\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.844549 
4775 scope.go:117] "RemoveContainer" containerID="edd8a8fdc8eb347f32ec09584f5f673b84dc2a4af5db33fd67bbb0286de9deb4" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.933368 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.933425 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be773756-aa1d-455c-9eac-a1e053636353-run-httpd\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.933445 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.933473 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-scripts\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.933509 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be773756-aa1d-455c-9eac-a1e053636353-log-httpd\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.933541 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-trdk6\" (UniqueName: \"kubernetes.io/projected/be773756-aa1d-455c-9eac-a1e053636353-kube-api-access-trdk6\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.933561 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-config-data\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.934057 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be773756-aa1d-455c-9eac-a1e053636353-run-httpd\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.934209 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be773756-aa1d-455c-9eac-a1e053636353-log-httpd\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.938346 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-scripts\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.938938 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: 
I1216 15:16:47.941244 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.942777 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-config-data\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:47 crc kubenswrapper[4775]: I1216 15:16:47.951346 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trdk6\" (UniqueName: \"kubernetes.io/projected/be773756-aa1d-455c-9eac-a1e053636353-kube-api-access-trdk6\") pod \"ceilometer-0\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " pod="openstack/ceilometer-0" Dec 16 15:16:48 crc kubenswrapper[4775]: I1216 15:16:48.134899 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:48 crc kubenswrapper[4775]: I1216 15:16:48.659034 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:48 crc kubenswrapper[4775]: I1216 15:16:48.712985 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be773756-aa1d-455c-9eac-a1e053636353","Type":"ContainerStarted","Data":"77c7f2322fbaecc969f73b162bcd7015d1c09c930a2517ff0d324b5af1bbb1af"} Dec 16 15:16:48 crc kubenswrapper[4775]: I1216 15:16:48.826203 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 16 15:16:49 crc kubenswrapper[4775]: I1216 15:16:49.135964 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:49 crc kubenswrapper[4775]: I1216 15:16:49.136031 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:49 crc kubenswrapper[4775]: I1216 15:16:49.167134 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 15:16:49 crc kubenswrapper[4775]: I1216 15:16:49.167179 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 15:16:49 crc kubenswrapper[4775]: I1216 15:16:49.186403 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:49 crc kubenswrapper[4775]: I1216 15:16:49.191948 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:49 crc kubenswrapper[4775]: I1216 15:16:49.254411 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 15:16:49 crc kubenswrapper[4775]: 
I1216 15:16:49.264265 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 15:16:49 crc kubenswrapper[4775]: I1216 15:16:49.351252 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d" path="/var/lib/kubelet/pods/2eb9d1e3-0f49-4aed-a2f1-f24bcdbfb43d/volumes" Dec 16 15:16:49 crc kubenswrapper[4775]: I1216 15:16:49.351997 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3399b2a-e8f1-442d-be74-160e5524608b" path="/var/lib/kubelet/pods/f3399b2a-e8f1-442d-be74-160e5524608b/volumes" Dec 16 15:16:49 crc kubenswrapper[4775]: I1216 15:16:49.722771 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 15:16:49 crc kubenswrapper[4775]: I1216 15:16:49.722816 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:49 crc kubenswrapper[4775]: I1216 15:16:49.722832 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:49 crc kubenswrapper[4775]: I1216 15:16:49.722843 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 15:16:51 crc kubenswrapper[4775]: I1216 15:16:51.147290 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-59fcc7f56d-krpcl" Dec 16 15:16:51 crc kubenswrapper[4775]: I1216 15:16:51.219678 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-c49bc9464-wb445"] Dec 16 15:16:51 crc kubenswrapper[4775]: I1216 15:16:51.219945 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-c49bc9464-wb445" podUID="eb13cacc-e521-4220-a731-18136d35425c" containerName="heat-engine" 
containerID="cri-o://8906e4b9e85b5b63b8682c82f8bb1a39eecc76a007eefd56a8310c2c1fc594a8" gracePeriod=60 Dec 16 15:16:52 crc kubenswrapper[4775]: I1216 15:16:52.370526 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:52 crc kubenswrapper[4775]: I1216 15:16:52.371031 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 15:16:52 crc kubenswrapper[4775]: I1216 15:16:52.411574 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:52 crc kubenswrapper[4775]: I1216 15:16:52.685542 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 15:16:52 crc kubenswrapper[4775]: I1216 15:16:52.685946 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 15:16:52 crc kubenswrapper[4775]: I1216 15:16:52.690600 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 15:16:53 crc kubenswrapper[4775]: E1216 15:16:53.969362 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8906e4b9e85b5b63b8682c82f8bb1a39eecc76a007eefd56a8310c2c1fc594a8" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 15:16:53 crc kubenswrapper[4775]: E1216 15:16:53.974238 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8906e4b9e85b5b63b8682c82f8bb1a39eecc76a007eefd56a8310c2c1fc594a8" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 15:16:53 crc kubenswrapper[4775]: E1216 15:16:53.975673 4775 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8906e4b9e85b5b63b8682c82f8bb1a39eecc76a007eefd56a8310c2c1fc594a8" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 15:16:53 crc kubenswrapper[4775]: E1216 15:16:53.975724 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-c49bc9464-wb445" podUID="eb13cacc-e521-4220-a731-18136d35425c" containerName="heat-engine" Dec 16 15:16:57 crc kubenswrapper[4775]: I1216 15:16:57.835513 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be773756-aa1d-455c-9eac-a1e053636353","Type":"ContainerStarted","Data":"1d4864043409240ba3a3333e1ed0ee7ee35781b5d743f5934ebffdc81145e610"} Dec 16 15:16:58 crc kubenswrapper[4775]: I1216 15:16:58.855306 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6sj6z" event={"ID":"862dcea5-6162-4150-84d2-69baeced1f01","Type":"ContainerStarted","Data":"1e50f2d806020593672b0862872199c85bf7b84ee4c99cc922d90b773a8fe8a6"} Dec 16 15:16:58 crc kubenswrapper[4775]: I1216 15:16:58.893975 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-6sj6z" podStartSLOduration=2.377904638 podStartE2EDuration="16.89395114s" podCreationTimestamp="2025-12-16 15:16:42 +0000 UTC" firstStartedPulling="2025-12-16 15:16:43.688465856 +0000 UTC m=+1328.639544779" lastFinishedPulling="2025-12-16 15:16:58.204512358 +0000 UTC m=+1343.155591281" observedRunningTime="2025-12-16 15:16:58.87492887 +0000 UTC m=+1343.826007803" watchObservedRunningTime="2025-12-16 15:16:58.89395114 +0000 UTC m=+1343.845030073" Dec 16 15:16:59 crc kubenswrapper[4775]: I1216 15:16:59.867534 4775 
generic.go:334] "Generic (PLEG): container finished" podID="eb13cacc-e521-4220-a731-18136d35425c" containerID="8906e4b9e85b5b63b8682c82f8bb1a39eecc76a007eefd56a8310c2c1fc594a8" exitCode=0 Dec 16 15:16:59 crc kubenswrapper[4775]: I1216 15:16:59.868440 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-c49bc9464-wb445" event={"ID":"eb13cacc-e521-4220-a731-18136d35425c","Type":"ContainerDied","Data":"8906e4b9e85b5b63b8682c82f8bb1a39eecc76a007eefd56a8310c2c1fc594a8"} Dec 16 15:16:59 crc kubenswrapper[4775]: I1216 15:16:59.870720 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-c49bc9464-wb445" event={"ID":"eb13cacc-e521-4220-a731-18136d35425c","Type":"ContainerDied","Data":"cd9c7e198c4dc902113bd14ecc7fdaca2b7993e14db35e20e96a667ba8f072d9"} Dec 16 15:16:59 crc kubenswrapper[4775]: I1216 15:16:59.870749 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd9c7e198c4dc902113bd14ecc7fdaca2b7993e14db35e20e96a667ba8f072d9" Dec 16 15:16:59 crc kubenswrapper[4775]: I1216 15:16:59.885574 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be773756-aa1d-455c-9eac-a1e053636353","Type":"ContainerStarted","Data":"1d06a29ad3e9a206ac4fcd976a399f4eceda6be49b0d943a0c0f3159fddb6c91"} Dec 16 15:16:59 crc kubenswrapper[4775]: I1216 15:16:59.932059 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:17:00 crc kubenswrapper[4775]: I1216 15:17:00.028012 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clphq\" (UniqueName: \"kubernetes.io/projected/eb13cacc-e521-4220-a731-18136d35425c-kube-api-access-clphq\") pod \"eb13cacc-e521-4220-a731-18136d35425c\" (UID: \"eb13cacc-e521-4220-a731-18136d35425c\") " Dec 16 15:17:00 crc kubenswrapper[4775]: I1216 15:17:00.028057 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-config-data-custom\") pod \"eb13cacc-e521-4220-a731-18136d35425c\" (UID: \"eb13cacc-e521-4220-a731-18136d35425c\") " Dec 16 15:17:00 crc kubenswrapper[4775]: I1216 15:17:00.029411 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-combined-ca-bundle\") pod \"eb13cacc-e521-4220-a731-18136d35425c\" (UID: \"eb13cacc-e521-4220-a731-18136d35425c\") " Dec 16 15:17:00 crc kubenswrapper[4775]: I1216 15:17:00.029465 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-config-data\") pod \"eb13cacc-e521-4220-a731-18136d35425c\" (UID: \"eb13cacc-e521-4220-a731-18136d35425c\") " Dec 16 15:17:00 crc kubenswrapper[4775]: I1216 15:17:00.038517 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "eb13cacc-e521-4220-a731-18136d35425c" (UID: "eb13cacc-e521-4220-a731-18136d35425c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:00 crc kubenswrapper[4775]: I1216 15:17:00.065533 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb13cacc-e521-4220-a731-18136d35425c-kube-api-access-clphq" (OuterVolumeSpecName: "kube-api-access-clphq") pod "eb13cacc-e521-4220-a731-18136d35425c" (UID: "eb13cacc-e521-4220-a731-18136d35425c"). InnerVolumeSpecName "kube-api-access-clphq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:00 crc kubenswrapper[4775]: I1216 15:17:00.067081 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb13cacc-e521-4220-a731-18136d35425c" (UID: "eb13cacc-e521-4220-a731-18136d35425c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:00 crc kubenswrapper[4775]: I1216 15:17:00.124442 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-config-data" (OuterVolumeSpecName: "config-data") pod "eb13cacc-e521-4220-a731-18136d35425c" (UID: "eb13cacc-e521-4220-a731-18136d35425c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:00 crc kubenswrapper[4775]: I1216 15:17:00.132071 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clphq\" (UniqueName: \"kubernetes.io/projected/eb13cacc-e521-4220-a731-18136d35425c-kube-api-access-clphq\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:00 crc kubenswrapper[4775]: I1216 15:17:00.132104 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:00 crc kubenswrapper[4775]: I1216 15:17:00.132114 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:00 crc kubenswrapper[4775]: I1216 15:17:00.132124 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb13cacc-e521-4220-a731-18136d35425c-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:00 crc kubenswrapper[4775]: I1216 15:17:00.892333 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-c49bc9464-wb445" Dec 16 15:17:00 crc kubenswrapper[4775]: I1216 15:17:00.931816 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-c49bc9464-wb445"] Dec 16 15:17:00 crc kubenswrapper[4775]: I1216 15:17:00.940481 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-c49bc9464-wb445"] Dec 16 15:17:01 crc kubenswrapper[4775]: I1216 15:17:01.073821 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:01 crc kubenswrapper[4775]: I1216 15:17:01.352604 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb13cacc-e521-4220-a731-18136d35425c" path="/var/lib/kubelet/pods/eb13cacc-e521-4220-a731-18136d35425c/volumes" Dec 16 15:17:01 crc kubenswrapper[4775]: I1216 15:17:01.904058 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be773756-aa1d-455c-9eac-a1e053636353","Type":"ContainerStarted","Data":"cd7da3692074ddb68f6c5adc48f0658e17871791e73ff2ebaa33f7ea44ba6208"} Dec 16 15:17:02 crc kubenswrapper[4775]: I1216 15:17:02.916226 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be773756-aa1d-455c-9eac-a1e053636353","Type":"ContainerStarted","Data":"0ac4da1039738e526ecd22162234878767cee77f987b2bb7423bd0a75d7f1ae2"} Dec 16 15:17:02 crc kubenswrapper[4775]: I1216 15:17:02.916584 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 15:17:02 crc kubenswrapper[4775]: I1216 15:17:02.916440 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be773756-aa1d-455c-9eac-a1e053636353" containerName="sg-core" containerID="cri-o://cd7da3692074ddb68f6c5adc48f0658e17871791e73ff2ebaa33f7ea44ba6208" gracePeriod=30 Dec 16 15:17:02 crc kubenswrapper[4775]: I1216 15:17:02.916411 4775 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="be773756-aa1d-455c-9eac-a1e053636353" containerName="ceilometer-central-agent" containerID="cri-o://1d4864043409240ba3a3333e1ed0ee7ee35781b5d743f5934ebffdc81145e610" gracePeriod=30 Dec 16 15:17:02 crc kubenswrapper[4775]: I1216 15:17:02.916463 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be773756-aa1d-455c-9eac-a1e053636353" containerName="proxy-httpd" containerID="cri-o://0ac4da1039738e526ecd22162234878767cee77f987b2bb7423bd0a75d7f1ae2" gracePeriod=30 Dec 16 15:17:02 crc kubenswrapper[4775]: I1216 15:17:02.916521 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be773756-aa1d-455c-9eac-a1e053636353" containerName="ceilometer-notification-agent" containerID="cri-o://1d06a29ad3e9a206ac4fcd976a399f4eceda6be49b0d943a0c0f3159fddb6c91" gracePeriod=30 Dec 16 15:17:02 crc kubenswrapper[4775]: I1216 15:17:02.949631 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.020151438 podStartE2EDuration="15.949605874s" podCreationTimestamp="2025-12-16 15:16:47 +0000 UTC" firstStartedPulling="2025-12-16 15:16:48.681530004 +0000 UTC m=+1333.632608927" lastFinishedPulling="2025-12-16 15:17:02.61098444 +0000 UTC m=+1347.562063363" observedRunningTime="2025-12-16 15:17:02.947872599 +0000 UTC m=+1347.898951542" watchObservedRunningTime="2025-12-16 15:17:02.949605874 +0000 UTC m=+1347.900684807" Dec 16 15:17:03 crc kubenswrapper[4775]: I1216 15:17:03.927161 4775 generic.go:334] "Generic (PLEG): container finished" podID="be773756-aa1d-455c-9eac-a1e053636353" containerID="cd7da3692074ddb68f6c5adc48f0658e17871791e73ff2ebaa33f7ea44ba6208" exitCode=2 Dec 16 15:17:03 crc kubenswrapper[4775]: I1216 15:17:03.927646 4775 generic.go:334] "Generic (PLEG): container finished" podID="be773756-aa1d-455c-9eac-a1e053636353" 
containerID="1d06a29ad3e9a206ac4fcd976a399f4eceda6be49b0d943a0c0f3159fddb6c91" exitCode=0 Dec 16 15:17:03 crc kubenswrapper[4775]: I1216 15:17:03.927313 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be773756-aa1d-455c-9eac-a1e053636353","Type":"ContainerDied","Data":"cd7da3692074ddb68f6c5adc48f0658e17871791e73ff2ebaa33f7ea44ba6208"} Dec 16 15:17:03 crc kubenswrapper[4775]: I1216 15:17:03.927685 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be773756-aa1d-455c-9eac-a1e053636353","Type":"ContainerDied","Data":"1d06a29ad3e9a206ac4fcd976a399f4eceda6be49b0d943a0c0f3159fddb6c91"} Dec 16 15:17:11 crc kubenswrapper[4775]: I1216 15:17:11.020264 4775 generic.go:334] "Generic (PLEG): container finished" podID="862dcea5-6162-4150-84d2-69baeced1f01" containerID="1e50f2d806020593672b0862872199c85bf7b84ee4c99cc922d90b773a8fe8a6" exitCode=0 Dec 16 15:17:11 crc kubenswrapper[4775]: I1216 15:17:11.020348 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6sj6z" event={"ID":"862dcea5-6162-4150-84d2-69baeced1f01","Type":"ContainerDied","Data":"1e50f2d806020593672b0862872199c85bf7b84ee4c99cc922d90b773a8fe8a6"} Dec 16 15:17:12 crc kubenswrapper[4775]: I1216 15:17:12.033867 4775 generic.go:334] "Generic (PLEG): container finished" podID="be773756-aa1d-455c-9eac-a1e053636353" containerID="1d4864043409240ba3a3333e1ed0ee7ee35781b5d743f5934ebffdc81145e610" exitCode=0 Dec 16 15:17:12 crc kubenswrapper[4775]: I1216 15:17:12.033916 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be773756-aa1d-455c-9eac-a1e053636353","Type":"ContainerDied","Data":"1d4864043409240ba3a3333e1ed0ee7ee35781b5d743f5934ebffdc81145e610"} Dec 16 15:17:12 crc kubenswrapper[4775]: I1216 15:17:12.365780 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6sj6z" Dec 16 15:17:12 crc kubenswrapper[4775]: I1216 15:17:12.489332 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-combined-ca-bundle\") pod \"862dcea5-6162-4150-84d2-69baeced1f01\" (UID: \"862dcea5-6162-4150-84d2-69baeced1f01\") " Dec 16 15:17:12 crc kubenswrapper[4775]: I1216 15:17:12.489420 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-config-data\") pod \"862dcea5-6162-4150-84d2-69baeced1f01\" (UID: \"862dcea5-6162-4150-84d2-69baeced1f01\") " Dec 16 15:17:12 crc kubenswrapper[4775]: I1216 15:17:12.489507 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-scripts\") pod \"862dcea5-6162-4150-84d2-69baeced1f01\" (UID: \"862dcea5-6162-4150-84d2-69baeced1f01\") " Dec 16 15:17:12 crc kubenswrapper[4775]: I1216 15:17:12.489539 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pfdb\" (UniqueName: \"kubernetes.io/projected/862dcea5-6162-4150-84d2-69baeced1f01-kube-api-access-9pfdb\") pod \"862dcea5-6162-4150-84d2-69baeced1f01\" (UID: \"862dcea5-6162-4150-84d2-69baeced1f01\") " Dec 16 15:17:12 crc kubenswrapper[4775]: I1216 15:17:12.499736 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/862dcea5-6162-4150-84d2-69baeced1f01-kube-api-access-9pfdb" (OuterVolumeSpecName: "kube-api-access-9pfdb") pod "862dcea5-6162-4150-84d2-69baeced1f01" (UID: "862dcea5-6162-4150-84d2-69baeced1f01"). InnerVolumeSpecName "kube-api-access-9pfdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:12 crc kubenswrapper[4775]: I1216 15:17:12.503110 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-scripts" (OuterVolumeSpecName: "scripts") pod "862dcea5-6162-4150-84d2-69baeced1f01" (UID: "862dcea5-6162-4150-84d2-69baeced1f01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:12 crc kubenswrapper[4775]: I1216 15:17:12.521843 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-config-data" (OuterVolumeSpecName: "config-data") pod "862dcea5-6162-4150-84d2-69baeced1f01" (UID: "862dcea5-6162-4150-84d2-69baeced1f01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:12 crc kubenswrapper[4775]: I1216 15:17:12.522541 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "862dcea5-6162-4150-84d2-69baeced1f01" (UID: "862dcea5-6162-4150-84d2-69baeced1f01"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:12 crc kubenswrapper[4775]: I1216 15:17:12.591576 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:12 crc kubenswrapper[4775]: I1216 15:17:12.591608 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:12 crc kubenswrapper[4775]: I1216 15:17:12.591617 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/862dcea5-6162-4150-84d2-69baeced1f01-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:12 crc kubenswrapper[4775]: I1216 15:17:12.591626 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pfdb\" (UniqueName: \"kubernetes.io/projected/862dcea5-6162-4150-84d2-69baeced1f01-kube-api-access-9pfdb\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.047198 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6sj6z" event={"ID":"862dcea5-6162-4150-84d2-69baeced1f01","Type":"ContainerDied","Data":"5e7bb13b1f12b747faca739b443674db321306eb23c2a7cfd798f893be186c79"} Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.047276 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6sj6z" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.047292 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e7bb13b1f12b747faca739b443674db321306eb23c2a7cfd798f893be186c79" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.139828 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 15:17:13 crc kubenswrapper[4775]: E1216 15:17:13.140564 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb13cacc-e521-4220-a731-18136d35425c" containerName="heat-engine" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.140579 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb13cacc-e521-4220-a731-18136d35425c" containerName="heat-engine" Dec 16 15:17:13 crc kubenswrapper[4775]: E1216 15:17:13.140591 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="862dcea5-6162-4150-84d2-69baeced1f01" containerName="nova-cell0-conductor-db-sync" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.140597 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="862dcea5-6162-4150-84d2-69baeced1f01" containerName="nova-cell0-conductor-db-sync" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.140775 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="862dcea5-6162-4150-84d2-69baeced1f01" containerName="nova-cell0-conductor-db-sync" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.140791 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb13cacc-e521-4220-a731-18136d35425c" containerName="heat-engine" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.143357 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.147068 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7xsqc" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.148381 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.167581 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.206030 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8f9f\" (UniqueName: \"kubernetes.io/projected/af50c3ce-5c89-46eb-bd8c-83346b17ad3d-kube-api-access-v8f9f\") pod \"nova-cell0-conductor-0\" (UID: \"af50c3ce-5c89-46eb-bd8c-83346b17ad3d\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.206180 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af50c3ce-5c89-46eb-bd8c-83346b17ad3d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"af50c3ce-5c89-46eb-bd8c-83346b17ad3d\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.206227 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af50c3ce-5c89-46eb-bd8c-83346b17ad3d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"af50c3ce-5c89-46eb-bd8c-83346b17ad3d\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.307989 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8f9f\" (UniqueName: 
\"kubernetes.io/projected/af50c3ce-5c89-46eb-bd8c-83346b17ad3d-kube-api-access-v8f9f\") pod \"nova-cell0-conductor-0\" (UID: \"af50c3ce-5c89-46eb-bd8c-83346b17ad3d\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.308327 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af50c3ce-5c89-46eb-bd8c-83346b17ad3d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"af50c3ce-5c89-46eb-bd8c-83346b17ad3d\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.308410 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af50c3ce-5c89-46eb-bd8c-83346b17ad3d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"af50c3ce-5c89-46eb-bd8c-83346b17ad3d\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.312860 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af50c3ce-5c89-46eb-bd8c-83346b17ad3d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"af50c3ce-5c89-46eb-bd8c-83346b17ad3d\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.319642 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af50c3ce-5c89-46eb-bd8c-83346b17ad3d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"af50c3ce-5c89-46eb-bd8c-83346b17ad3d\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.332470 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8f9f\" (UniqueName: \"kubernetes.io/projected/af50c3ce-5c89-46eb-bd8c-83346b17ad3d-kube-api-access-v8f9f\") pod \"nova-cell0-conductor-0\" (UID: 
\"af50c3ce-5c89-46eb-bd8c-83346b17ad3d\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.468466 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 15:17:13 crc kubenswrapper[4775]: I1216 15:17:13.937502 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 15:17:14 crc kubenswrapper[4775]: I1216 15:17:14.059420 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"af50c3ce-5c89-46eb-bd8c-83346b17ad3d","Type":"ContainerStarted","Data":"33bcc17238266bff6add9af53d761deecb41668bd85411b030c7a998c7b4a695"} Dec 16 15:17:15 crc kubenswrapper[4775]: I1216 15:17:15.069565 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"af50c3ce-5c89-46eb-bd8c-83346b17ad3d","Type":"ContainerStarted","Data":"5e56b26d9f1e3c4e64e646868066561c0c5b66707070a3a20fe1b79c37fe2e63"} Dec 16 15:17:15 crc kubenswrapper[4775]: I1216 15:17:15.069759 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 16 15:17:15 crc kubenswrapper[4775]: I1216 15:17:15.128477 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.1284554 podStartE2EDuration="2.1284554s" podCreationTimestamp="2025-12-16 15:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:17:15.126014223 +0000 UTC m=+1360.077093156" watchObservedRunningTime="2025-12-16 15:17:15.1284554 +0000 UTC m=+1360.079534323" Dec 16 15:17:18 crc kubenswrapper[4775]: I1216 15:17:18.139391 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="be773756-aa1d-455c-9eac-a1e053636353" containerName="proxy-httpd" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 16 15:17:23 crc kubenswrapper[4775]: I1216 15:17:23.506521 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.030817 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-smpjc"] Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.032216 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-smpjc" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.034316 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.036072 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.044471 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-smpjc"] Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.135186 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-config-data\") pod \"nova-cell0-cell-mapping-smpjc\" (UID: \"46586ca7-367e-47d4-bd95-11037f7bb60f\") " pod="openstack/nova-cell0-cell-mapping-smpjc" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.135329 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw6mj\" (UniqueName: \"kubernetes.io/projected/46586ca7-367e-47d4-bd95-11037f7bb60f-kube-api-access-lw6mj\") pod \"nova-cell0-cell-mapping-smpjc\" (UID: \"46586ca7-367e-47d4-bd95-11037f7bb60f\") " pod="openstack/nova-cell0-cell-mapping-smpjc" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.135412 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-smpjc\" (UID: \"46586ca7-367e-47d4-bd95-11037f7bb60f\") " pod="openstack/nova-cell0-cell-mapping-smpjc" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.135624 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-scripts\") pod \"nova-cell0-cell-mapping-smpjc\" (UID: \"46586ca7-367e-47d4-bd95-11037f7bb60f\") " pod="openstack/nova-cell0-cell-mapping-smpjc" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.242907 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-smpjc\" (UID: \"46586ca7-367e-47d4-bd95-11037f7bb60f\") " pod="openstack/nova-cell0-cell-mapping-smpjc" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.247847 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-scripts\") pod \"nova-cell0-cell-mapping-smpjc\" (UID: \"46586ca7-367e-47d4-bd95-11037f7bb60f\") " pod="openstack/nova-cell0-cell-mapping-smpjc" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.248556 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-config-data\") pod \"nova-cell0-cell-mapping-smpjc\" (UID: \"46586ca7-367e-47d4-bd95-11037f7bb60f\") " pod="openstack/nova-cell0-cell-mapping-smpjc" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.248739 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lw6mj\" (UniqueName: \"kubernetes.io/projected/46586ca7-367e-47d4-bd95-11037f7bb60f-kube-api-access-lw6mj\") pod \"nova-cell0-cell-mapping-smpjc\" (UID: \"46586ca7-367e-47d4-bd95-11037f7bb60f\") " pod="openstack/nova-cell0-cell-mapping-smpjc" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.258137 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-config-data\") pod \"nova-cell0-cell-mapping-smpjc\" (UID: \"46586ca7-367e-47d4-bd95-11037f7bb60f\") " pod="openstack/nova-cell0-cell-mapping-smpjc" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.258468 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.258345 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-smpjc\" (UID: \"46586ca7-367e-47d4-bd95-11037f7bb60f\") " pod="openstack/nova-cell0-cell-mapping-smpjc" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.263445 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-scripts\") pod \"nova-cell0-cell-mapping-smpjc\" (UID: \"46586ca7-367e-47d4-bd95-11037f7bb60f\") " pod="openstack/nova-cell0-cell-mapping-smpjc" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.296522 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.319342 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.325219 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.351354 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/591cb000-dea7-4b70-a162-211974a0e8a8-logs\") pod \"nova-api-0\" (UID: \"591cb000-dea7-4b70-a162-211974a0e8a8\") " pod="openstack/nova-api-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.351468 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrz4t\" (UniqueName: \"kubernetes.io/projected/591cb000-dea7-4b70-a162-211974a0e8a8-kube-api-access-jrz4t\") pod \"nova-api-0\" (UID: \"591cb000-dea7-4b70-a162-211974a0e8a8\") " pod="openstack/nova-api-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.351517 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591cb000-dea7-4b70-a162-211974a0e8a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"591cb000-dea7-4b70-a162-211974a0e8a8\") " pod="openstack/nova-api-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.351578 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591cb000-dea7-4b70-a162-211974a0e8a8-config-data\") pod \"nova-api-0\" (UID: \"591cb000-dea7-4b70-a162-211974a0e8a8\") " pod="openstack/nova-api-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.352525 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lw6mj\" (UniqueName: \"kubernetes.io/projected/46586ca7-367e-47d4-bd95-11037f7bb60f-kube-api-access-lw6mj\") pod \"nova-cell0-cell-mapping-smpjc\" (UID: \"46586ca7-367e-47d4-bd95-11037f7bb60f\") " pod="openstack/nova-cell0-cell-mapping-smpjc" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.354406 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-smpjc" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.461176 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrz4t\" (UniqueName: \"kubernetes.io/projected/591cb000-dea7-4b70-a162-211974a0e8a8-kube-api-access-jrz4t\") pod \"nova-api-0\" (UID: \"591cb000-dea7-4b70-a162-211974a0e8a8\") " pod="openstack/nova-api-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.461209 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.461223 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591cb000-dea7-4b70-a162-211974a0e8a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"591cb000-dea7-4b70-a162-211974a0e8a8\") " pod="openstack/nova-api-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.461252 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591cb000-dea7-4b70-a162-211974a0e8a8-config-data\") pod \"nova-api-0\" (UID: \"591cb000-dea7-4b70-a162-211974a0e8a8\") " pod="openstack/nova-api-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.461373 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/591cb000-dea7-4b70-a162-211974a0e8a8-logs\") pod \"nova-api-0\" (UID: \"591cb000-dea7-4b70-a162-211974a0e8a8\") " pod="openstack/nova-api-0" 
Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.462295 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/591cb000-dea7-4b70-a162-211974a0e8a8-logs\") pod \"nova-api-0\" (UID: \"591cb000-dea7-4b70-a162-211974a0e8a8\") " pod="openstack/nova-api-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.462504 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.472663 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591cb000-dea7-4b70-a162-211974a0e8a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"591cb000-dea7-4b70-a162-211974a0e8a8\") " pod="openstack/nova-api-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.479049 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.485644 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591cb000-dea7-4b70-a162-211974a0e8a8-config-data\") pod \"nova-api-0\" (UID: \"591cb000-dea7-4b70-a162-211974a0e8a8\") " pod="openstack/nova-api-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.496126 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.511564 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrz4t\" (UniqueName: \"kubernetes.io/projected/591cb000-dea7-4b70-a162-211974a0e8a8-kube-api-access-jrz4t\") pod \"nova-api-0\" (UID: \"591cb000-dea7-4b70-a162-211974a0e8a8\") " pod="openstack/nova-api-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.540731 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.575338 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be193a0f-15ac-4a52-b1c3-2174a3ac1864-config-data\") pod \"nova-scheduler-0\" (UID: \"be193a0f-15ac-4a52-b1c3-2174a3ac1864\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.575510 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkqg5\" (UniqueName: \"kubernetes.io/projected/be193a0f-15ac-4a52-b1c3-2174a3ac1864-kube-api-access-kkqg5\") pod \"nova-scheduler-0\" (UID: \"be193a0f-15ac-4a52-b1c3-2174a3ac1864\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.575530 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be193a0f-15ac-4a52-b1c3-2174a3ac1864-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be193a0f-15ac-4a52-b1c3-2174a3ac1864\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.589171 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.591017 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.617142 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.621068 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.682712 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315a5ae8-7116-4d3d-8fa6-67d728959668-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"315a5ae8-7116-4d3d-8fa6-67d728959668\") " pod="openstack/nova-metadata-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.682783 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be193a0f-15ac-4a52-b1c3-2174a3ac1864-config-data\") pod \"nova-scheduler-0\" (UID: \"be193a0f-15ac-4a52-b1c3-2174a3ac1864\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.682968 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r852g\" (UniqueName: \"kubernetes.io/projected/315a5ae8-7116-4d3d-8fa6-67d728959668-kube-api-access-r852g\") pod \"nova-metadata-0\" (UID: \"315a5ae8-7116-4d3d-8fa6-67d728959668\") " pod="openstack/nova-metadata-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.682996 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315a5ae8-7116-4d3d-8fa6-67d728959668-config-data\") pod \"nova-metadata-0\" (UID: \"315a5ae8-7116-4d3d-8fa6-67d728959668\") " pod="openstack/nova-metadata-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.683042 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315a5ae8-7116-4d3d-8fa6-67d728959668-logs\") pod \"nova-metadata-0\" (UID: \"315a5ae8-7116-4d3d-8fa6-67d728959668\") " pod="openstack/nova-metadata-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.683079 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkqg5\" (UniqueName: \"kubernetes.io/projected/be193a0f-15ac-4a52-b1c3-2174a3ac1864-kube-api-access-kkqg5\") pod \"nova-scheduler-0\" (UID: \"be193a0f-15ac-4a52-b1c3-2174a3ac1864\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.683094 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be193a0f-15ac-4a52-b1c3-2174a3ac1864-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be193a0f-15ac-4a52-b1c3-2174a3ac1864\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.715798 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be193a0f-15ac-4a52-b1c3-2174a3ac1864-config-data\") pod \"nova-scheduler-0\" (UID: \"be193a0f-15ac-4a52-b1c3-2174a3ac1864\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.724205 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkqg5\" (UniqueName: \"kubernetes.io/projected/be193a0f-15ac-4a52-b1c3-2174a3ac1864-kube-api-access-kkqg5\") pod \"nova-scheduler-0\" (UID: \"be193a0f-15ac-4a52-b1c3-2174a3ac1864\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.725730 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be193a0f-15ac-4a52-b1c3-2174a3ac1864-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"be193a0f-15ac-4a52-b1c3-2174a3ac1864\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.742511 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.743916 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.751365 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.784601 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315a5ae8-7116-4d3d-8fa6-67d728959668-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"315a5ae8-7116-4d3d-8fa6-67d728959668\") " pod="openstack/nova-metadata-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.784667 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93c495eb-7d30-4db8-8c2b-47410834e889-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"93c495eb-7d30-4db8-8c2b-47410834e889\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.784738 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93c495eb-7d30-4db8-8c2b-47410834e889-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"93c495eb-7d30-4db8-8c2b-47410834e889\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.784777 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsh8b\" (UniqueName: 
\"kubernetes.io/projected/93c495eb-7d30-4db8-8c2b-47410834e889-kube-api-access-qsh8b\") pod \"nova-cell1-novncproxy-0\" (UID: \"93c495eb-7d30-4db8-8c2b-47410834e889\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.784800 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r852g\" (UniqueName: \"kubernetes.io/projected/315a5ae8-7116-4d3d-8fa6-67d728959668-kube-api-access-r852g\") pod \"nova-metadata-0\" (UID: \"315a5ae8-7116-4d3d-8fa6-67d728959668\") " pod="openstack/nova-metadata-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.784826 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315a5ae8-7116-4d3d-8fa6-67d728959668-config-data\") pod \"nova-metadata-0\" (UID: \"315a5ae8-7116-4d3d-8fa6-67d728959668\") " pod="openstack/nova-metadata-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.784853 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315a5ae8-7116-4d3d-8fa6-67d728959668-logs\") pod \"nova-metadata-0\" (UID: \"315a5ae8-7116-4d3d-8fa6-67d728959668\") " pod="openstack/nova-metadata-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.785266 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315a5ae8-7116-4d3d-8fa6-67d728959668-logs\") pod \"nova-metadata-0\" (UID: \"315a5ae8-7116-4d3d-8fa6-67d728959668\") " pod="openstack/nova-metadata-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.800670 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-6tmts"] Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.803637 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.829943 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r852g\" (UniqueName: \"kubernetes.io/projected/315a5ae8-7116-4d3d-8fa6-67d728959668-kube-api-access-r852g\") pod \"nova-metadata-0\" (UID: \"315a5ae8-7116-4d3d-8fa6-67d728959668\") " pod="openstack/nova-metadata-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.830737 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.840565 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315a5ae8-7116-4d3d-8fa6-67d728959668-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"315a5ae8-7116-4d3d-8fa6-67d728959668\") " pod="openstack/nova-metadata-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.857005 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-6tmts"] Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.862029 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315a5ae8-7116-4d3d-8fa6-67d728959668-config-data\") pod \"nova-metadata-0\" (UID: \"315a5ae8-7116-4d3d-8fa6-67d728959668\") " pod="openstack/nova-metadata-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.862070 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.889975 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93c495eb-7d30-4db8-8c2b-47410834e889-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"93c495eb-7d30-4db8-8c2b-47410834e889\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.892446 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.892472 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93c495eb-7d30-4db8-8c2b-47410834e889-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"93c495eb-7d30-4db8-8c2b-47410834e889\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.892502 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsh8b\" (UniqueName: \"kubernetes.io/projected/93c495eb-7d30-4db8-8c2b-47410834e889-kube-api-access-qsh8b\") pod \"nova-cell1-novncproxy-0\" (UID: \"93c495eb-7d30-4db8-8c2b-47410834e889\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.892528 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mf89\" (UniqueName: \"kubernetes.io/projected/f9a78807-ba42-4640-94be-a98bc08000a6-kube-api-access-7mf89\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " 
pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.892569 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-config\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.892592 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.892609 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-dns-svc\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.892632 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.899413 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93c495eb-7d30-4db8-8c2b-47410834e889-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"93c495eb-7d30-4db8-8c2b-47410834e889\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.915006 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsh8b\" (UniqueName: \"kubernetes.io/projected/93c495eb-7d30-4db8-8c2b-47410834e889-kube-api-access-qsh8b\") pod \"nova-cell1-novncproxy-0\" (UID: \"93c495eb-7d30-4db8-8c2b-47410834e889\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.915036 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93c495eb-7d30-4db8-8c2b-47410834e889-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"93c495eb-7d30-4db8-8c2b-47410834e889\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.994136 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.995133 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.995378 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mf89\" (UniqueName: \"kubernetes.io/projected/f9a78807-ba42-4640-94be-a98bc08000a6-kube-api-access-7mf89\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:24 crc 
kubenswrapper[4775]: I1216 15:17:24.995549 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-config\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.995621 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.996202 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-dns-svc\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.996262 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.997784 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.998151 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.998163 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-config\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:24 crc kubenswrapper[4775]: I1216 15:17:24.998956 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-dns-svc\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.019533 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mf89\" (UniqueName: \"kubernetes.io/projected/f9a78807-ba42-4640-94be-a98bc08000a6-kube-api-access-7mf89\") pod \"dnsmasq-dns-9b86998b5-6tmts\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.027302 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.122252 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.152235 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.294964 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.322613 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-smpjc"] Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.522603 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jlnfn"] Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.526003 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jlnfn" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.529076 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.529099 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.533587 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jlnfn"] Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.590437 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:17:25 crc kubenswrapper[4775]: W1216 15:17:25.598040 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe193a0f_15ac_4a52_b1c3_2174a3ac1864.slice/crio-7b2562bf40534f00e31af0f74a4d1d2c95d19b10f3b321b6d6061b367094de01 WatchSource:0}: Error finding container 7b2562bf40534f00e31af0f74a4d1d2c95d19b10f3b321b6d6061b367094de01: Status 404 returned error can't find the container with id 7b2562bf40534f00e31af0f74a4d1d2c95d19b10f3b321b6d6061b367094de01 Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 
15:17:25.617417 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jlnfn\" (UID: \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\") " pod="openstack/nova-cell1-conductor-db-sync-jlnfn" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.617458 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpqsq\" (UniqueName: \"kubernetes.io/projected/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-kube-api-access-vpqsq\") pod \"nova-cell1-conductor-db-sync-jlnfn\" (UID: \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\") " pod="openstack/nova-cell1-conductor-db-sync-jlnfn" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.617551 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-scripts\") pod \"nova-cell1-conductor-db-sync-jlnfn\" (UID: \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\") " pod="openstack/nova-cell1-conductor-db-sync-jlnfn" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.617618 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-config-data\") pod \"nova-cell1-conductor-db-sync-jlnfn\" (UID: \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\") " pod="openstack/nova-cell1-conductor-db-sync-jlnfn" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.720233 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-scripts\") pod \"nova-cell1-conductor-db-sync-jlnfn\" (UID: \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\") " 
pod="openstack/nova-cell1-conductor-db-sync-jlnfn" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.720470 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-config-data\") pod \"nova-cell1-conductor-db-sync-jlnfn\" (UID: \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\") " pod="openstack/nova-cell1-conductor-db-sync-jlnfn" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.720531 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jlnfn\" (UID: \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\") " pod="openstack/nova-cell1-conductor-db-sync-jlnfn" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.720582 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpqsq\" (UniqueName: \"kubernetes.io/projected/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-kube-api-access-vpqsq\") pod \"nova-cell1-conductor-db-sync-jlnfn\" (UID: \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\") " pod="openstack/nova-cell1-conductor-db-sync-jlnfn" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.729810 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-scripts\") pod \"nova-cell1-conductor-db-sync-jlnfn\" (UID: \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\") " pod="openstack/nova-cell1-conductor-db-sync-jlnfn" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.730275 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jlnfn\" (UID: \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\") " 
pod="openstack/nova-cell1-conductor-db-sync-jlnfn" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.730918 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-config-data\") pod \"nova-cell1-conductor-db-sync-jlnfn\" (UID: \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\") " pod="openstack/nova-cell1-conductor-db-sync-jlnfn" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.738715 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.748288 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpqsq\" (UniqueName: \"kubernetes.io/projected/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-kube-api-access-vpqsq\") pod \"nova-cell1-conductor-db-sync-jlnfn\" (UID: \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\") " pod="openstack/nova-cell1-conductor-db-sync-jlnfn" Dec 16 15:17:25 crc kubenswrapper[4775]: W1216 15:17:25.847042 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93c495eb_7d30_4db8_8c2b_47410834e889.slice/crio-11f2da6b343233b2f148600c572aaaafa2917a0acc06008279c9a0ed93199881 WatchSource:0}: Error finding container 11f2da6b343233b2f148600c572aaaafa2917a0acc06008279c9a0ed93199881: Status 404 returned error can't find the container with id 11f2da6b343233b2f148600c572aaaafa2917a0acc06008279c9a0ed93199881 Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.852966 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.866791 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jlnfn" Dec 16 15:17:25 crc kubenswrapper[4775]: I1216 15:17:25.900979 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-6tmts"] Dec 16 15:17:26 crc kubenswrapper[4775]: I1216 15:17:26.192442 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"93c495eb-7d30-4db8-8c2b-47410834e889","Type":"ContainerStarted","Data":"11f2da6b343233b2f148600c572aaaafa2917a0acc06008279c9a0ed93199881"} Dec 16 15:17:26 crc kubenswrapper[4775]: I1216 15:17:26.207068 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"591cb000-dea7-4b70-a162-211974a0e8a8","Type":"ContainerStarted","Data":"9ab2f27b05a0ce57c20cad41d8f9e67a7cc5b071dedf2e01620dd08ca9d64c92"} Dec 16 15:17:26 crc kubenswrapper[4775]: I1216 15:17:26.219457 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-smpjc" event={"ID":"46586ca7-367e-47d4-bd95-11037f7bb60f","Type":"ContainerStarted","Data":"e2dbca579561629389f9c4485b9f0fd64e8db5ddf0326341697d07388e8e4994"} Dec 16 15:17:26 crc kubenswrapper[4775]: I1216 15:17:26.219514 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-smpjc" event={"ID":"46586ca7-367e-47d4-bd95-11037f7bb60f","Type":"ContainerStarted","Data":"67ff49c7a208b7a048b501c307462d0cc2cac7f559c80247ecf6bec83ccbb1b7"} Dec 16 15:17:26 crc kubenswrapper[4775]: I1216 15:17:26.232231 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-6tmts" event={"ID":"f9a78807-ba42-4640-94be-a98bc08000a6","Type":"ContainerStarted","Data":"fe2c60dc29a4084cf5e04cc3e2c95243300464cd744c30848b14cf94af7eca09"} Dec 16 15:17:26 crc kubenswrapper[4775]: I1216 15:17:26.232282 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-6tmts" 
event={"ID":"f9a78807-ba42-4640-94be-a98bc08000a6","Type":"ContainerStarted","Data":"9ac916dcde6794e69e9329758abfcb551a1adad3c1fdcfac7d28a8b8524ea861"} Dec 16 15:17:26 crc kubenswrapper[4775]: I1216 15:17:26.246666 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be193a0f-15ac-4a52-b1c3-2174a3ac1864","Type":"ContainerStarted","Data":"7b2562bf40534f00e31af0f74a4d1d2c95d19b10f3b321b6d6061b367094de01"} Dec 16 15:17:26 crc kubenswrapper[4775]: I1216 15:17:26.250464 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-smpjc" podStartSLOduration=2.2504402519999998 podStartE2EDuration="2.250440252s" podCreationTimestamp="2025-12-16 15:17:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:17:26.244642465 +0000 UTC m=+1371.195721398" watchObservedRunningTime="2025-12-16 15:17:26.250440252 +0000 UTC m=+1371.201519175" Dec 16 15:17:26 crc kubenswrapper[4775]: I1216 15:17:26.266299 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"315a5ae8-7116-4d3d-8fa6-67d728959668","Type":"ContainerStarted","Data":"e11e9463641dad62aad4472584fd1849fede9bcbd7fa6b8c8154f54a7c117c57"} Dec 16 15:17:26 crc kubenswrapper[4775]: I1216 15:17:26.415560 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jlnfn"] Dec 16 15:17:27 crc kubenswrapper[4775]: I1216 15:17:27.328461 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jlnfn" event={"ID":"b63d44e7-07c5-48d1-bd00-6e8be2b2c889","Type":"ContainerStarted","Data":"3635d72ed63ee3edda4b94047de6593c62df3a0c20e58402bbce0c2173380187"} Dec 16 15:17:27 crc kubenswrapper[4775]: I1216 15:17:27.328519 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jlnfn" 
event={"ID":"b63d44e7-07c5-48d1-bd00-6e8be2b2c889","Type":"ContainerStarted","Data":"d2af24d8623dc8178a096ede4e16fc979af02c939a2e684282452d50315656a7"} Dec 16 15:17:27 crc kubenswrapper[4775]: I1216 15:17:27.346087 4775 generic.go:334] "Generic (PLEG): container finished" podID="f9a78807-ba42-4640-94be-a98bc08000a6" containerID="fe2c60dc29a4084cf5e04cc3e2c95243300464cd744c30848b14cf94af7eca09" exitCode=0 Dec 16 15:17:27 crc kubenswrapper[4775]: I1216 15:17:27.353075 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-jlnfn" podStartSLOduration=2.3530531630000002 podStartE2EDuration="2.353053163s" podCreationTimestamp="2025-12-16 15:17:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:17:27.349528811 +0000 UTC m=+1372.300607734" watchObservedRunningTime="2025-12-16 15:17:27.353053163 +0000 UTC m=+1372.304132086" Dec 16 15:17:27 crc kubenswrapper[4775]: I1216 15:17:27.356906 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:27 crc kubenswrapper[4775]: I1216 15:17:27.357039 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-6tmts" event={"ID":"f9a78807-ba42-4640-94be-a98bc08000a6","Type":"ContainerDied","Data":"fe2c60dc29a4084cf5e04cc3e2c95243300464cd744c30848b14cf94af7eca09"} Dec 16 15:17:27 crc kubenswrapper[4775]: I1216 15:17:27.357124 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-6tmts" event={"ID":"f9a78807-ba42-4640-94be-a98bc08000a6","Type":"ContainerStarted","Data":"a1274319190212b94cc34588b38f69ef4340276bcaadf212aefc0a4f4ab6e4ab"} Dec 16 15:17:27 crc kubenswrapper[4775]: I1216 15:17:27.375082 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b86998b5-6tmts" 
podStartSLOduration=3.375065702 podStartE2EDuration="3.375065702s" podCreationTimestamp="2025-12-16 15:17:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:17:27.374164274 +0000 UTC m=+1372.325243197" watchObservedRunningTime="2025-12-16 15:17:27.375065702 +0000 UTC m=+1372.326144625" Dec 16 15:17:28 crc kubenswrapper[4775]: I1216 15:17:28.040520 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:28 crc kubenswrapper[4775]: I1216 15:17:28.050502 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:17:31 crc kubenswrapper[4775]: I1216 15:17:31.405350 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be193a0f-15ac-4a52-b1c3-2174a3ac1864","Type":"ContainerStarted","Data":"a0f0ce0fa15e11e29f3d55172cc152248d772df1a829809b3d72a28466f4495e"} Dec 16 15:17:31 crc kubenswrapper[4775]: I1216 15:17:31.410765 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"315a5ae8-7116-4d3d-8fa6-67d728959668","Type":"ContainerStarted","Data":"6ea5a2a8d3dae8425fe2df9d14c7afa1b7f9883543f91e503709a15ebf0f4a4b"} Dec 16 15:17:31 crc kubenswrapper[4775]: I1216 15:17:31.410850 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"315a5ae8-7116-4d3d-8fa6-67d728959668","Type":"ContainerStarted","Data":"56313e4000290288574aea138c07d3ff20348ead91ce616f7e47efc39715b998"} Dec 16 15:17:31 crc kubenswrapper[4775]: I1216 15:17:31.410924 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="315a5ae8-7116-4d3d-8fa6-67d728959668" containerName="nova-metadata-metadata" containerID="cri-o://6ea5a2a8d3dae8425fe2df9d14c7afa1b7f9883543f91e503709a15ebf0f4a4b" gracePeriod=30 Dec 16 15:17:31 crc 
kubenswrapper[4775]: I1216 15:17:31.410964 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="315a5ae8-7116-4d3d-8fa6-67d728959668" containerName="nova-metadata-log" containerID="cri-o://56313e4000290288574aea138c07d3ff20348ead91ce616f7e47efc39715b998" gracePeriod=30 Dec 16 15:17:31 crc kubenswrapper[4775]: I1216 15:17:31.413769 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"93c495eb-7d30-4db8-8c2b-47410834e889","Type":"ContainerStarted","Data":"b7b68e3a0262a8a8082c4920ba9713c9becf917dec8a90cb506ae627478b6193"} Dec 16 15:17:31 crc kubenswrapper[4775]: I1216 15:17:31.414314 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="93c495eb-7d30-4db8-8c2b-47410834e889" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b7b68e3a0262a8a8082c4920ba9713c9becf917dec8a90cb506ae627478b6193" gracePeriod=30 Dec 16 15:17:31 crc kubenswrapper[4775]: I1216 15:17:31.421922 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"591cb000-dea7-4b70-a162-211974a0e8a8","Type":"ContainerStarted","Data":"9bcaaec2564cb5ffff7898286e099b3dd73deb6bc52cd3d3dd77958648eb0768"} Dec 16 15:17:31 crc kubenswrapper[4775]: I1216 15:17:31.421981 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"591cb000-dea7-4b70-a162-211974a0e8a8","Type":"ContainerStarted","Data":"42f8e8607b50b5848bd7c3b9711fb75e4ff0c4b9673e45672b0426d5675c457e"} Dec 16 15:17:31 crc kubenswrapper[4775]: I1216 15:17:31.442867 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.087297121 podStartE2EDuration="7.442838629s" podCreationTimestamp="2025-12-16 15:17:24 +0000 UTC" firstStartedPulling="2025-12-16 15:17:25.600375179 +0000 UTC m=+1370.551454102" 
lastFinishedPulling="2025-12-16 15:17:29.955916687 +0000 UTC m=+1374.906995610" observedRunningTime="2025-12-16 15:17:31.430826873 +0000 UTC m=+1376.381905806" watchObservedRunningTime="2025-12-16 15:17:31.442838629 +0000 UTC m=+1376.393917552" Dec 16 15:17:31 crc kubenswrapper[4775]: I1216 15:17:31.464288 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.373383349 podStartE2EDuration="7.46426775s" podCreationTimestamp="2025-12-16 15:17:24 +0000 UTC" firstStartedPulling="2025-12-16 15:17:25.865739319 +0000 UTC m=+1370.816818242" lastFinishedPulling="2025-12-16 15:17:29.95662372 +0000 UTC m=+1374.907702643" observedRunningTime="2025-12-16 15:17:31.461411378 +0000 UTC m=+1376.412490331" watchObservedRunningTime="2025-12-16 15:17:31.46426775 +0000 UTC m=+1376.415346683" Dec 16 15:17:31 crc kubenswrapper[4775]: I1216 15:17:31.492843 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.85410007 podStartE2EDuration="7.49282346s" podCreationTimestamp="2025-12-16 15:17:24 +0000 UTC" firstStartedPulling="2025-12-16 15:17:25.321844007 +0000 UTC m=+1370.272922930" lastFinishedPulling="2025-12-16 15:17:29.960567397 +0000 UTC m=+1374.911646320" observedRunningTime="2025-12-16 15:17:31.483298323 +0000 UTC m=+1376.434377276" watchObservedRunningTime="2025-12-16 15:17:31.49282346 +0000 UTC m=+1376.443902383" Dec 16 15:17:31 crc kubenswrapper[4775]: I1216 15:17:31.515585 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.294290451 podStartE2EDuration="7.515560032s" podCreationTimestamp="2025-12-16 15:17:24 +0000 UTC" firstStartedPulling="2025-12-16 15:17:25.740659939 +0000 UTC m=+1370.691738862" lastFinishedPulling="2025-12-16 15:17:29.96192952 +0000 UTC m=+1374.913008443" observedRunningTime="2025-12-16 15:17:31.50368635 +0000 UTC m=+1376.454765273" 
watchObservedRunningTime="2025-12-16 15:17:31.515560032 +0000 UTC m=+1376.466638955" Dec 16 15:17:32 crc kubenswrapper[4775]: I1216 15:17:32.435810 4775 generic.go:334] "Generic (PLEG): container finished" podID="315a5ae8-7116-4d3d-8fa6-67d728959668" containerID="6ea5a2a8d3dae8425fe2df9d14c7afa1b7f9883543f91e503709a15ebf0f4a4b" exitCode=0 Dec 16 15:17:32 crc kubenswrapper[4775]: I1216 15:17:32.436312 4775 generic.go:334] "Generic (PLEG): container finished" podID="315a5ae8-7116-4d3d-8fa6-67d728959668" containerID="56313e4000290288574aea138c07d3ff20348ead91ce616f7e47efc39715b998" exitCode=143 Dec 16 15:17:32 crc kubenswrapper[4775]: I1216 15:17:32.435877 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"315a5ae8-7116-4d3d-8fa6-67d728959668","Type":"ContainerDied","Data":"6ea5a2a8d3dae8425fe2df9d14c7afa1b7f9883543f91e503709a15ebf0f4a4b"} Dec 16 15:17:32 crc kubenswrapper[4775]: I1216 15:17:32.436382 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"315a5ae8-7116-4d3d-8fa6-67d728959668","Type":"ContainerDied","Data":"56313e4000290288574aea138c07d3ff20348ead91ce616f7e47efc39715b998"} Dec 16 15:17:32 crc kubenswrapper[4775]: I1216 15:17:32.595856 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4775]: I1216 15:17:32.710120 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315a5ae8-7116-4d3d-8fa6-67d728959668-combined-ca-bundle\") pod \"315a5ae8-7116-4d3d-8fa6-67d728959668\" (UID: \"315a5ae8-7116-4d3d-8fa6-67d728959668\") " Dec 16 15:17:32 crc kubenswrapper[4775]: I1216 15:17:32.710226 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315a5ae8-7116-4d3d-8fa6-67d728959668-config-data\") pod \"315a5ae8-7116-4d3d-8fa6-67d728959668\" (UID: \"315a5ae8-7116-4d3d-8fa6-67d728959668\") " Dec 16 15:17:32 crc kubenswrapper[4775]: I1216 15:17:32.710261 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r852g\" (UniqueName: \"kubernetes.io/projected/315a5ae8-7116-4d3d-8fa6-67d728959668-kube-api-access-r852g\") pod \"315a5ae8-7116-4d3d-8fa6-67d728959668\" (UID: \"315a5ae8-7116-4d3d-8fa6-67d728959668\") " Dec 16 15:17:32 crc kubenswrapper[4775]: I1216 15:17:32.710288 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315a5ae8-7116-4d3d-8fa6-67d728959668-logs\") pod \"315a5ae8-7116-4d3d-8fa6-67d728959668\" (UID: \"315a5ae8-7116-4d3d-8fa6-67d728959668\") " Dec 16 15:17:32 crc kubenswrapper[4775]: I1216 15:17:32.711058 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/315a5ae8-7116-4d3d-8fa6-67d728959668-logs" (OuterVolumeSpecName: "logs") pod "315a5ae8-7116-4d3d-8fa6-67d728959668" (UID: "315a5ae8-7116-4d3d-8fa6-67d728959668"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:17:32 crc kubenswrapper[4775]: I1216 15:17:32.717340 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/315a5ae8-7116-4d3d-8fa6-67d728959668-kube-api-access-r852g" (OuterVolumeSpecName: "kube-api-access-r852g") pod "315a5ae8-7116-4d3d-8fa6-67d728959668" (UID: "315a5ae8-7116-4d3d-8fa6-67d728959668"). InnerVolumeSpecName "kube-api-access-r852g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:32 crc kubenswrapper[4775]: I1216 15:17:32.745537 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315a5ae8-7116-4d3d-8fa6-67d728959668-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "315a5ae8-7116-4d3d-8fa6-67d728959668" (UID: "315a5ae8-7116-4d3d-8fa6-67d728959668"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:32 crc kubenswrapper[4775]: I1216 15:17:32.756096 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315a5ae8-7116-4d3d-8fa6-67d728959668-config-data" (OuterVolumeSpecName: "config-data") pod "315a5ae8-7116-4d3d-8fa6-67d728959668" (UID: "315a5ae8-7116-4d3d-8fa6-67d728959668"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:32 crc kubenswrapper[4775]: I1216 15:17:32.812651 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315a5ae8-7116-4d3d-8fa6-67d728959668-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:32 crc kubenswrapper[4775]: I1216 15:17:32.812689 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315a5ae8-7116-4d3d-8fa6-67d728959668-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:32 crc kubenswrapper[4775]: I1216 15:17:32.812703 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r852g\" (UniqueName: \"kubernetes.io/projected/315a5ae8-7116-4d3d-8fa6-67d728959668-kube-api-access-r852g\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:32 crc kubenswrapper[4775]: I1216 15:17:32.812717 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315a5ae8-7116-4d3d-8fa6-67d728959668-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.344166 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.446544 4775 generic.go:334] "Generic (PLEG): container finished" podID="be773756-aa1d-455c-9eac-a1e053636353" containerID="0ac4da1039738e526ecd22162234878767cee77f987b2bb7423bd0a75d7f1ae2" exitCode=137 Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.446609 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be773756-aa1d-455c-9eac-a1e053636353","Type":"ContainerDied","Data":"0ac4da1039738e526ecd22162234878767cee77f987b2bb7423bd0a75d7f1ae2"} Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.446646 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be773756-aa1d-455c-9eac-a1e053636353","Type":"ContainerDied","Data":"77c7f2322fbaecc969f73b162bcd7015d1c09c930a2517ff0d324b5af1bbb1af"} Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.446663 4775 scope.go:117] "RemoveContainer" containerID="0ac4da1039738e526ecd22162234878767cee77f987b2bb7423bd0a75d7f1ae2" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.446787 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.449677 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"315a5ae8-7116-4d3d-8fa6-67d728959668","Type":"ContainerDied","Data":"e11e9463641dad62aad4472584fd1849fede9bcbd7fa6b8c8154f54a7c117c57"} Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.449735 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.511125 4775 scope.go:117] "RemoveContainer" containerID="cd7da3692074ddb68f6c5adc48f0658e17871791e73ff2ebaa33f7ea44ba6208" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.518669 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.524149 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.528661 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trdk6\" (UniqueName: \"kubernetes.io/projected/be773756-aa1d-455c-9eac-a1e053636353-kube-api-access-trdk6\") pod \"be773756-aa1d-455c-9eac-a1e053636353\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.528981 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be773756-aa1d-455c-9eac-a1e053636353-log-httpd\") pod \"be773756-aa1d-455c-9eac-a1e053636353\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.529159 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-sg-core-conf-yaml\") pod \"be773756-aa1d-455c-9eac-a1e053636353\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.529299 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-scripts\") pod \"be773756-aa1d-455c-9eac-a1e053636353\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 
15:17:33.529506 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be773756-aa1d-455c-9eac-a1e053636353-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "be773756-aa1d-455c-9eac-a1e053636353" (UID: "be773756-aa1d-455c-9eac-a1e053636353"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.529529 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be773756-aa1d-455c-9eac-a1e053636353-run-httpd\") pod \"be773756-aa1d-455c-9eac-a1e053636353\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.529651 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-combined-ca-bundle\") pod \"be773756-aa1d-455c-9eac-a1e053636353\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.529689 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-config-data\") pod \"be773756-aa1d-455c-9eac-a1e053636353\" (UID: \"be773756-aa1d-455c-9eac-a1e053636353\") " Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.530404 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be773756-aa1d-455c-9eac-a1e053636353-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "be773756-aa1d-455c-9eac-a1e053636353" (UID: "be773756-aa1d-455c-9eac-a1e053636353"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.531270 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be773756-aa1d-455c-9eac-a1e053636353-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.531313 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be773756-aa1d-455c-9eac-a1e053636353-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.541181 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be773756-aa1d-455c-9eac-a1e053636353-kube-api-access-trdk6" (OuterVolumeSpecName: "kube-api-access-trdk6") pod "be773756-aa1d-455c-9eac-a1e053636353" (UID: "be773756-aa1d-455c-9eac-a1e053636353"). InnerVolumeSpecName "kube-api-access-trdk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.543220 4775 scope.go:117] "RemoveContainer" containerID="1d06a29ad3e9a206ac4fcd976a399f4eceda6be49b0d943a0c0f3159fddb6c91" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.567773 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:33 crc kubenswrapper[4775]: E1216 15:17:33.569167 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be773756-aa1d-455c-9eac-a1e053636353" containerName="ceilometer-central-agent" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.569195 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="be773756-aa1d-455c-9eac-a1e053636353" containerName="ceilometer-central-agent" Dec 16 15:17:33 crc kubenswrapper[4775]: E1216 15:17:33.569206 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be773756-aa1d-455c-9eac-a1e053636353" containerName="proxy-httpd" Dec 16 15:17:33 crc 
kubenswrapper[4775]: I1216 15:17:33.569212 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="be773756-aa1d-455c-9eac-a1e053636353" containerName="proxy-httpd" Dec 16 15:17:33 crc kubenswrapper[4775]: E1216 15:17:33.569426 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315a5ae8-7116-4d3d-8fa6-67d728959668" containerName="nova-metadata-metadata" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.569433 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="315a5ae8-7116-4d3d-8fa6-67d728959668" containerName="nova-metadata-metadata" Dec 16 15:17:33 crc kubenswrapper[4775]: E1216 15:17:33.569452 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315a5ae8-7116-4d3d-8fa6-67d728959668" containerName="nova-metadata-log" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.569458 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="315a5ae8-7116-4d3d-8fa6-67d728959668" containerName="nova-metadata-log" Dec 16 15:17:33 crc kubenswrapper[4775]: E1216 15:17:33.569477 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be773756-aa1d-455c-9eac-a1e053636353" containerName="sg-core" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.569483 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="be773756-aa1d-455c-9eac-a1e053636353" containerName="sg-core" Dec 16 15:17:33 crc kubenswrapper[4775]: E1216 15:17:33.569491 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be773756-aa1d-455c-9eac-a1e053636353" containerName="ceilometer-notification-agent" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.569498 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="be773756-aa1d-455c-9eac-a1e053636353" containerName="ceilometer-notification-agent" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.569678 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="be773756-aa1d-455c-9eac-a1e053636353" containerName="ceilometer-central-agent" 
Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.569687 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="315a5ae8-7116-4d3d-8fa6-67d728959668" containerName="nova-metadata-metadata" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.569700 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="be773756-aa1d-455c-9eac-a1e053636353" containerName="ceilometer-notification-agent" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.569714 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="be773756-aa1d-455c-9eac-a1e053636353" containerName="sg-core" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.569725 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="be773756-aa1d-455c-9eac-a1e053636353" containerName="proxy-httpd" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.569735 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="315a5ae8-7116-4d3d-8fa6-67d728959668" containerName="nova-metadata-log" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.571093 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.573159 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-scripts" (OuterVolumeSpecName: "scripts") pod "be773756-aa1d-455c-9eac-a1e053636353" (UID: "be773756-aa1d-455c-9eac-a1e053636353"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.576673 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.577144 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.581858 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.608267 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "be773756-aa1d-455c-9eac-a1e053636353" (UID: "be773756-aa1d-455c-9eac-a1e053636353"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.633411 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.633545 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-config-data\") pod \"nova-metadata-0\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.633583 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.633633 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzp5j\" (UniqueName: \"kubernetes.io/projected/6cb4276f-72bc-4392-87bb-6a71b0e2233f-kube-api-access-rzp5j\") pod \"nova-metadata-0\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.633699 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cb4276f-72bc-4392-87bb-6a71b0e2233f-logs\") pod \"nova-metadata-0\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.633766 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.633778 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.633792 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trdk6\" (UniqueName: \"kubernetes.io/projected/be773756-aa1d-455c-9eac-a1e053636353-kube-api-access-trdk6\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.678880 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "be773756-aa1d-455c-9eac-a1e053636353" (UID: "be773756-aa1d-455c-9eac-a1e053636353"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.687221 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-config-data" (OuterVolumeSpecName: "config-data") pod "be773756-aa1d-455c-9eac-a1e053636353" (UID: "be773756-aa1d-455c-9eac-a1e053636353"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.736466 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cb4276f-72bc-4392-87bb-6a71b0e2233f-logs\") pod \"nova-metadata-0\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.736589 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.736675 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-config-data\") pod \"nova-metadata-0\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.736704 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" 
(UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.736750 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzp5j\" (UniqueName: \"kubernetes.io/projected/6cb4276f-72bc-4392-87bb-6a71b0e2233f-kube-api-access-rzp5j\") pod \"nova-metadata-0\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.736804 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.736818 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be773756-aa1d-455c-9eac-a1e053636353-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.737629 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cb4276f-72bc-4392-87bb-6a71b0e2233f-logs\") pod \"nova-metadata-0\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.741147 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.742740 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-config-data\") pod \"nova-metadata-0\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " 
pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.749947 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.767007 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzp5j\" (UniqueName: \"kubernetes.io/projected/6cb4276f-72bc-4392-87bb-6a71b0e2233f-kube-api-access-rzp5j\") pod \"nova-metadata-0\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.767256 4775 scope.go:117] "RemoveContainer" containerID="1d4864043409240ba3a3333e1ed0ee7ee35781b5d743f5934ebffdc81145e610" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.796077 4775 scope.go:117] "RemoveContainer" containerID="0ac4da1039738e526ecd22162234878767cee77f987b2bb7423bd0a75d7f1ae2" Dec 16 15:17:33 crc kubenswrapper[4775]: E1216 15:17:33.804053 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac4da1039738e526ecd22162234878767cee77f987b2bb7423bd0a75d7f1ae2\": container with ID starting with 0ac4da1039738e526ecd22162234878767cee77f987b2bb7423bd0a75d7f1ae2 not found: ID does not exist" containerID="0ac4da1039738e526ecd22162234878767cee77f987b2bb7423bd0a75d7f1ae2" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.804111 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac4da1039738e526ecd22162234878767cee77f987b2bb7423bd0a75d7f1ae2"} err="failed to get container status \"0ac4da1039738e526ecd22162234878767cee77f987b2bb7423bd0a75d7f1ae2\": rpc error: code = NotFound desc = could not find container 
\"0ac4da1039738e526ecd22162234878767cee77f987b2bb7423bd0a75d7f1ae2\": container with ID starting with 0ac4da1039738e526ecd22162234878767cee77f987b2bb7423bd0a75d7f1ae2 not found: ID does not exist" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.804143 4775 scope.go:117] "RemoveContainer" containerID="cd7da3692074ddb68f6c5adc48f0658e17871791e73ff2ebaa33f7ea44ba6208" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.804242 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:33 crc kubenswrapper[4775]: E1216 15:17:33.805047 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd7da3692074ddb68f6c5adc48f0658e17871791e73ff2ebaa33f7ea44ba6208\": container with ID starting with cd7da3692074ddb68f6c5adc48f0658e17871791e73ff2ebaa33f7ea44ba6208 not found: ID does not exist" containerID="cd7da3692074ddb68f6c5adc48f0658e17871791e73ff2ebaa33f7ea44ba6208" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.805110 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd7da3692074ddb68f6c5adc48f0658e17871791e73ff2ebaa33f7ea44ba6208"} err="failed to get container status \"cd7da3692074ddb68f6c5adc48f0658e17871791e73ff2ebaa33f7ea44ba6208\": rpc error: code = NotFound desc = could not find container \"cd7da3692074ddb68f6c5adc48f0658e17871791e73ff2ebaa33f7ea44ba6208\": container with ID starting with cd7da3692074ddb68f6c5adc48f0658e17871791e73ff2ebaa33f7ea44ba6208 not found: ID does not exist" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.805146 4775 scope.go:117] "RemoveContainer" containerID="1d06a29ad3e9a206ac4fcd976a399f4eceda6be49b0d943a0c0f3159fddb6c91" Dec 16 15:17:33 crc kubenswrapper[4775]: E1216 15:17:33.808024 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1d06a29ad3e9a206ac4fcd976a399f4eceda6be49b0d943a0c0f3159fddb6c91\": container with ID starting with 1d06a29ad3e9a206ac4fcd976a399f4eceda6be49b0d943a0c0f3159fddb6c91 not found: ID does not exist" containerID="1d06a29ad3e9a206ac4fcd976a399f4eceda6be49b0d943a0c0f3159fddb6c91" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.808068 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d06a29ad3e9a206ac4fcd976a399f4eceda6be49b0d943a0c0f3159fddb6c91"} err="failed to get container status \"1d06a29ad3e9a206ac4fcd976a399f4eceda6be49b0d943a0c0f3159fddb6c91\": rpc error: code = NotFound desc = could not find container \"1d06a29ad3e9a206ac4fcd976a399f4eceda6be49b0d943a0c0f3159fddb6c91\": container with ID starting with 1d06a29ad3e9a206ac4fcd976a399f4eceda6be49b0d943a0c0f3159fddb6c91 not found: ID does not exist" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.808102 4775 scope.go:117] "RemoveContainer" containerID="1d4864043409240ba3a3333e1ed0ee7ee35781b5d743f5934ebffdc81145e610" Dec 16 15:17:33 crc kubenswrapper[4775]: E1216 15:17:33.808984 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d4864043409240ba3a3333e1ed0ee7ee35781b5d743f5934ebffdc81145e610\": container with ID starting with 1d4864043409240ba3a3333e1ed0ee7ee35781b5d743f5934ebffdc81145e610 not found: ID does not exist" containerID="1d4864043409240ba3a3333e1ed0ee7ee35781b5d743f5934ebffdc81145e610" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.809008 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d4864043409240ba3a3333e1ed0ee7ee35781b5d743f5934ebffdc81145e610"} err="failed to get container status \"1d4864043409240ba3a3333e1ed0ee7ee35781b5d743f5934ebffdc81145e610\": rpc error: code = NotFound desc = could not find container \"1d4864043409240ba3a3333e1ed0ee7ee35781b5d743f5934ebffdc81145e610\": container with ID 
starting with 1d4864043409240ba3a3333e1ed0ee7ee35781b5d743f5934ebffdc81145e610 not found: ID does not exist" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.809026 4775 scope.go:117] "RemoveContainer" containerID="6ea5a2a8d3dae8425fe2df9d14c7afa1b7f9883543f91e503709a15ebf0f4a4b" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.816431 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.858057 4775 scope.go:117] "RemoveContainer" containerID="56313e4000290288574aea138c07d3ff20348ead91ce616f7e47efc39715b998" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.865939 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.868743 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.874350 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.874578 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 15:17:33 crc kubenswrapper[4775]: I1216 15:17:33.877227 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.049057 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dc6d50e-df14-4345-8522-7b3f41d8b956-log-httpd\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.049092 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.049115 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dc6d50e-df14-4345-8522-7b3f41d8b956-run-httpd\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.049131 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-config-data\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.049516 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-799dw\" (UniqueName: \"kubernetes.io/projected/5dc6d50e-df14-4345-8522-7b3f41d8b956-kube-api-access-799dw\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.050016 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.050218 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-scripts\") pod \"ceilometer-0\" (UID: 
\"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.062135 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.152144 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-799dw\" (UniqueName: \"kubernetes.io/projected/5dc6d50e-df14-4345-8522-7b3f41d8b956-kube-api-access-799dw\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.152552 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.152659 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-scripts\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.152794 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.152828 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dc6d50e-df14-4345-8522-7b3f41d8b956-log-httpd\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " 
pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.152915 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dc6d50e-df14-4345-8522-7b3f41d8b956-run-httpd\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.152944 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-config-data\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.153776 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dc6d50e-df14-4345-8522-7b3f41d8b956-log-httpd\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.154488 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dc6d50e-df14-4345-8522-7b3f41d8b956-run-httpd\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.160045 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-config-data\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.160414 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.160977 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-scripts\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.161622 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.172907 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-799dw\" (UniqueName: \"kubernetes.io/projected/5dc6d50e-df14-4345-8522-7b3f41d8b956-kube-api-access-799dw\") pod \"ceilometer-0\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.232745 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.521968 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.541496 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.541598 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 15:17:34 crc kubenswrapper[4775]: W1216 15:17:34.587477 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cb4276f_72bc_4392_87bb_6a71b0e2233f.slice/crio-856bb239db175b1b43aafa525babecb1399db4ad8e36b182a25a1f38c14b3d87 WatchSource:0}: Error finding container 856bb239db175b1b43aafa525babecb1399db4ad8e36b182a25a1f38c14b3d87: Status 404 returned error can't find the container with id 856bb239db175b1b43aafa525babecb1399db4ad8e36b182a25a1f38c14b3d87 Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.668739 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:34 crc kubenswrapper[4775]: W1216 15:17:34.668922 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dc6d50e_df14_4345_8522_7b3f41d8b956.slice/crio-7b179467391399b48434c3fbdda1e90c1a0a0e4b63d2315abe9619600442e05c WatchSource:0}: Error finding container 7b179467391399b48434c3fbdda1e90c1a0a0e4b63d2315abe9619600442e05c: Status 404 returned error can't find the container with id 7b179467391399b48434c3fbdda1e90c1a0a0e4b63d2315abe9619600442e05c Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.672618 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.863370 4775 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.863411 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 15:17:34 crc kubenswrapper[4775]: I1216 15:17:34.920259 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.123460 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.154059 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.257359 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-p8zvh"] Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.257601 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" podUID="799c9224-8212-452d-83c5-238ad4a6ed31" containerName="dnsmasq-dns" containerID="cri-o://712d3f3ed3b67e20fc681276a21bff051e6251dfcf09c0306990f18602c2fa8a" gracePeriod=10 Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.381229 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="315a5ae8-7116-4d3d-8fa6-67d728959668" path="/var/lib/kubelet/pods/315a5ae8-7116-4d3d-8fa6-67d728959668/volumes" Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.381831 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be773756-aa1d-455c-9eac-a1e053636353" path="/var/lib/kubelet/pods/be773756-aa1d-455c-9eac-a1e053636353/volumes" Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.481846 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5dc6d50e-df14-4345-8522-7b3f41d8b956","Type":"ContainerStarted","Data":"7b179467391399b48434c3fbdda1e90c1a0a0e4b63d2315abe9619600442e05c"} Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.483522 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6cb4276f-72bc-4392-87bb-6a71b0e2233f","Type":"ContainerStarted","Data":"e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc"} Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.483551 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6cb4276f-72bc-4392-87bb-6a71b0e2233f","Type":"ContainerStarted","Data":"635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e"} Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.483560 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6cb4276f-72bc-4392-87bb-6a71b0e2233f","Type":"ContainerStarted","Data":"856bb239db175b1b43aafa525babecb1399db4ad8e36b182a25a1f38c14b3d87"} Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.492150 4775 generic.go:334] "Generic (PLEG): container finished" podID="799c9224-8212-452d-83c5-238ad4a6ed31" containerID="712d3f3ed3b67e20fc681276a21bff051e6251dfcf09c0306990f18602c2fa8a" exitCode=0 Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.492267 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" event={"ID":"799c9224-8212-452d-83c5-238ad4a6ed31","Type":"ContainerDied","Data":"712d3f3ed3b67e20fc681276a21bff051e6251dfcf09c0306990f18602c2fa8a"} Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.505446 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-smpjc" event={"ID":"46586ca7-367e-47d4-bd95-11037f7bb60f","Type":"ContainerDied","Data":"e2dbca579561629389f9c4485b9f0fd64e8db5ddf0326341697d07388e8e4994"} Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 
15:17:35.505380 4775 generic.go:334] "Generic (PLEG): container finished" podID="46586ca7-367e-47d4-bd95-11037f7bb60f" containerID="e2dbca579561629389f9c4485b9f0fd64e8db5ddf0326341697d07388e8e4994" exitCode=0 Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.551759 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.551739061 podStartE2EDuration="2.551739061s" podCreationTimestamp="2025-12-16 15:17:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:17:35.521938791 +0000 UTC m=+1380.473017734" watchObservedRunningTime="2025-12-16 15:17:35.551739061 +0000 UTC m=+1380.502817984" Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.552161 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.626149 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="591cb000-dea7-4b70-a162-211974a0e8a8" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.626658 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="591cb000-dea7-4b70-a162-211974a0e8a8" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.884519 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.893240 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-ovsdbserver-nb\") pod \"799c9224-8212-452d-83c5-238ad4a6ed31\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.893279 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghbsx\" (UniqueName: \"kubernetes.io/projected/799c9224-8212-452d-83c5-238ad4a6ed31-kube-api-access-ghbsx\") pod \"799c9224-8212-452d-83c5-238ad4a6ed31\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.893362 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-config\") pod \"799c9224-8212-452d-83c5-238ad4a6ed31\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.893389 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-ovsdbserver-sb\") pod \"799c9224-8212-452d-83c5-238ad4a6ed31\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.893454 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-dns-svc\") pod \"799c9224-8212-452d-83c5-238ad4a6ed31\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.893491 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-dns-swift-storage-0\") pod \"799c9224-8212-452d-83c5-238ad4a6ed31\" (UID: \"799c9224-8212-452d-83c5-238ad4a6ed31\") " Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.901505 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/799c9224-8212-452d-83c5-238ad4a6ed31-kube-api-access-ghbsx" (OuterVolumeSpecName: "kube-api-access-ghbsx") pod "799c9224-8212-452d-83c5-238ad4a6ed31" (UID: "799c9224-8212-452d-83c5-238ad4a6ed31"). InnerVolumeSpecName "kube-api-access-ghbsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.974803 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "799c9224-8212-452d-83c5-238ad4a6ed31" (UID: "799c9224-8212-452d-83c5-238ad4a6ed31"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.986690 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-config" (OuterVolumeSpecName: "config") pod "799c9224-8212-452d-83c5-238ad4a6ed31" (UID: "799c9224-8212-452d-83c5-238ad4a6ed31"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.995169 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghbsx\" (UniqueName: \"kubernetes.io/projected/799c9224-8212-452d-83c5-238ad4a6ed31-kube-api-access-ghbsx\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.995346 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:35 crc kubenswrapper[4775]: I1216 15:17:35.995518 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:36 crc kubenswrapper[4775]: I1216 15:17:36.000039 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "799c9224-8212-452d-83c5-238ad4a6ed31" (UID: "799c9224-8212-452d-83c5-238ad4a6ed31"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:17:36 crc kubenswrapper[4775]: I1216 15:17:36.000158 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "799c9224-8212-452d-83c5-238ad4a6ed31" (UID: "799c9224-8212-452d-83c5-238ad4a6ed31"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:17:36 crc kubenswrapper[4775]: I1216 15:17:36.027612 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "799c9224-8212-452d-83c5-238ad4a6ed31" (UID: "799c9224-8212-452d-83c5-238ad4a6ed31"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:17:36 crc kubenswrapper[4775]: I1216 15:17:36.098480 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:36 crc kubenswrapper[4775]: I1216 15:17:36.098515 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:36 crc kubenswrapper[4775]: I1216 15:17:36.098527 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/799c9224-8212-452d-83c5-238ad4a6ed31-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:36 crc kubenswrapper[4775]: I1216 15:17:36.525132 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc6d50e-df14-4345-8522-7b3f41d8b956","Type":"ContainerStarted","Data":"abd091ea75c441b1b32f3067d5db38cc99820ebfe68250bc5f3dfd39608c940d"} Dec 16 15:17:36 crc kubenswrapper[4775]: I1216 15:17:36.527603 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" event={"ID":"799c9224-8212-452d-83c5-238ad4a6ed31","Type":"ContainerDied","Data":"baee6b31a9b4d609a7154aded2bf57012de69a50776c41b81d31212166be1423"} Dec 16 15:17:36 crc kubenswrapper[4775]: I1216 15:17:36.527676 4775 scope.go:117] "RemoveContainer" 
containerID="712d3f3ed3b67e20fc681276a21bff051e6251dfcf09c0306990f18602c2fa8a" Dec 16 15:17:36 crc kubenswrapper[4775]: I1216 15:17:36.527926 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-p8zvh" Dec 16 15:17:36 crc kubenswrapper[4775]: I1216 15:17:36.558685 4775 scope.go:117] "RemoveContainer" containerID="90d5598e6ff11e8b0a2b85ff321a12c25134c57df5752ad720223a7e460952fd" Dec 16 15:17:36 crc kubenswrapper[4775]: I1216 15:17:36.583952 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-p8zvh"] Dec 16 15:17:36 crc kubenswrapper[4775]: I1216 15:17:36.600404 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-p8zvh"] Dec 16 15:17:36 crc kubenswrapper[4775]: I1216 15:17:36.945103 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-smpjc" Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.118238 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-combined-ca-bundle\") pod \"46586ca7-367e-47d4-bd95-11037f7bb60f\" (UID: \"46586ca7-367e-47d4-bd95-11037f7bb60f\") " Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.118319 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-scripts\") pod \"46586ca7-367e-47d4-bd95-11037f7bb60f\" (UID: \"46586ca7-367e-47d4-bd95-11037f7bb60f\") " Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.118418 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-config-data\") pod \"46586ca7-367e-47d4-bd95-11037f7bb60f\" (UID: 
\"46586ca7-367e-47d4-bd95-11037f7bb60f\") " Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.118633 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw6mj\" (UniqueName: \"kubernetes.io/projected/46586ca7-367e-47d4-bd95-11037f7bb60f-kube-api-access-lw6mj\") pod \"46586ca7-367e-47d4-bd95-11037f7bb60f\" (UID: \"46586ca7-367e-47d4-bd95-11037f7bb60f\") " Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.155532 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-scripts" (OuterVolumeSpecName: "scripts") pod "46586ca7-367e-47d4-bd95-11037f7bb60f" (UID: "46586ca7-367e-47d4-bd95-11037f7bb60f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.155718 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46586ca7-367e-47d4-bd95-11037f7bb60f-kube-api-access-lw6mj" (OuterVolumeSpecName: "kube-api-access-lw6mj") pod "46586ca7-367e-47d4-bd95-11037f7bb60f" (UID: "46586ca7-367e-47d4-bd95-11037f7bb60f"). InnerVolumeSpecName "kube-api-access-lw6mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.169211 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-config-data" (OuterVolumeSpecName: "config-data") pod "46586ca7-367e-47d4-bd95-11037f7bb60f" (UID: "46586ca7-367e-47d4-bd95-11037f7bb60f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.185858 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46586ca7-367e-47d4-bd95-11037f7bb60f" (UID: "46586ca7-367e-47d4-bd95-11037f7bb60f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.221753 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.221793 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw6mj\" (UniqueName: \"kubernetes.io/projected/46586ca7-367e-47d4-bd95-11037f7bb60f-kube-api-access-lw6mj\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.221805 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.221816 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46586ca7-367e-47d4-bd95-11037f7bb60f-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.356799 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="799c9224-8212-452d-83c5-238ad4a6ed31" path="/var/lib/kubelet/pods/799c9224-8212-452d-83c5-238ad4a6ed31/volumes" Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.542472 4775 generic.go:334] "Generic (PLEG): container finished" podID="b63d44e7-07c5-48d1-bd00-6e8be2b2c889" 
containerID="3635d72ed63ee3edda4b94047de6593c62df3a0c20e58402bbce0c2173380187" exitCode=0 Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.542561 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jlnfn" event={"ID":"b63d44e7-07c5-48d1-bd00-6e8be2b2c889","Type":"ContainerDied","Data":"3635d72ed63ee3edda4b94047de6593c62df3a0c20e58402bbce0c2173380187"} Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.546370 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-smpjc" event={"ID":"46586ca7-367e-47d4-bd95-11037f7bb60f","Type":"ContainerDied","Data":"67ff49c7a208b7a048b501c307462d0cc2cac7f559c80247ecf6bec83ccbb1b7"} Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.546399 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67ff49c7a208b7a048b501c307462d0cc2cac7f559c80247ecf6bec83ccbb1b7" Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.546491 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-smpjc" Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.652188 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.652661 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="591cb000-dea7-4b70-a162-211974a0e8a8" containerName="nova-api-api" containerID="cri-o://9bcaaec2564cb5ffff7898286e099b3dd73deb6bc52cd3d3dd77958648eb0768" gracePeriod=30 Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.652995 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="591cb000-dea7-4b70-a162-211974a0e8a8" containerName="nova-api-log" containerID="cri-o://42f8e8607b50b5848bd7c3b9711fb75e4ff0c4b9673e45672b0426d5675c457e" gracePeriod=30 Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.664362 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.664787 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="be193a0f-15ac-4a52-b1c3-2174a3ac1864" containerName="nova-scheduler-scheduler" containerID="cri-o://a0f0ce0fa15e11e29f3d55172cc152248d772df1a829809b3d72a28466f4495e" gracePeriod=30 Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.726931 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.727702 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6cb4276f-72bc-4392-87bb-6a71b0e2233f" containerName="nova-metadata-metadata" containerID="cri-o://e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc" gracePeriod=30 Dec 16 15:17:37 crc kubenswrapper[4775]: I1216 15:17:37.727656 4775 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6cb4276f-72bc-4392-87bb-6a71b0e2233f" containerName="nova-metadata-log" containerID="cri-o://635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e" gracePeriod=30 Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.358905 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.441046 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-config-data\") pod \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.441258 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-combined-ca-bundle\") pod \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.441298 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzp5j\" (UniqueName: \"kubernetes.io/projected/6cb4276f-72bc-4392-87bb-6a71b0e2233f-kube-api-access-rzp5j\") pod \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.441335 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-nova-metadata-tls-certs\") pod \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.441383 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cb4276f-72bc-4392-87bb-6a71b0e2233f-logs\") pod \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\" (UID: \"6cb4276f-72bc-4392-87bb-6a71b0e2233f\") " Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.442479 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cb4276f-72bc-4392-87bb-6a71b0e2233f-logs" (OuterVolumeSpecName: "logs") pod "6cb4276f-72bc-4392-87bb-6a71b0e2233f" (UID: "6cb4276f-72bc-4392-87bb-6a71b0e2233f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.442852 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cb4276f-72bc-4392-87bb-6a71b0e2233f-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.445866 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cb4276f-72bc-4392-87bb-6a71b0e2233f-kube-api-access-rzp5j" (OuterVolumeSpecName: "kube-api-access-rzp5j") pod "6cb4276f-72bc-4392-87bb-6a71b0e2233f" (UID: "6cb4276f-72bc-4392-87bb-6a71b0e2233f"). InnerVolumeSpecName "kube-api-access-rzp5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.477137 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cb4276f-72bc-4392-87bb-6a71b0e2233f" (UID: "6cb4276f-72bc-4392-87bb-6a71b0e2233f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.493980 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-config-data" (OuterVolumeSpecName: "config-data") pod "6cb4276f-72bc-4392-87bb-6a71b0e2233f" (UID: "6cb4276f-72bc-4392-87bb-6a71b0e2233f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.529264 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6cb4276f-72bc-4392-87bb-6a71b0e2233f" (UID: "6cb4276f-72bc-4392-87bb-6a71b0e2233f"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.544965 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.545001 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.545014 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzp5j\" (UniqueName: \"kubernetes.io/projected/6cb4276f-72bc-4392-87bb-6a71b0e2233f-kube-api-access-rzp5j\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.545025 4775 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6cb4276f-72bc-4392-87bb-6a71b0e2233f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.558041 4775 generic.go:334] "Generic (PLEG): container finished" podID="591cb000-dea7-4b70-a162-211974a0e8a8" containerID="42f8e8607b50b5848bd7c3b9711fb75e4ff0c4b9673e45672b0426d5675c457e" exitCode=143 Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.558103 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"591cb000-dea7-4b70-a162-211974a0e8a8","Type":"ContainerDied","Data":"42f8e8607b50b5848bd7c3b9711fb75e4ff0c4b9673e45672b0426d5675c457e"} Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.560337 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc6d50e-df14-4345-8522-7b3f41d8b956","Type":"ContainerStarted","Data":"3b0eeed3eb498d88f58a831fd428e32689aed36255274f4595928f2573627371"} Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.567409 4775 generic.go:334] "Generic (PLEG): container finished" podID="6cb4276f-72bc-4392-87bb-6a71b0e2233f" containerID="e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc" exitCode=0 Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.567475 4775 generic.go:334] "Generic (PLEG): container finished" podID="6cb4276f-72bc-4392-87bb-6a71b0e2233f" containerID="635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e" exitCode=143 Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.567487 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.567499 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6cb4276f-72bc-4392-87bb-6a71b0e2233f","Type":"ContainerDied","Data":"e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc"} Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.567550 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6cb4276f-72bc-4392-87bb-6a71b0e2233f","Type":"ContainerDied","Data":"635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e"} Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.567562 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6cb4276f-72bc-4392-87bb-6a71b0e2233f","Type":"ContainerDied","Data":"856bb239db175b1b43aafa525babecb1399db4ad8e36b182a25a1f38c14b3d87"} Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.567580 4775 scope.go:117] "RemoveContainer" containerID="e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.636407 4775 scope.go:117] "RemoveContainer" containerID="635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.640471 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.662326 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.679136 4775 scope.go:117] "RemoveContainer" containerID="e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.679212 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:38 crc kubenswrapper[4775]: E1216 
15:17:38.679832 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="799c9224-8212-452d-83c5-238ad4a6ed31" containerName="dnsmasq-dns" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.679849 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="799c9224-8212-452d-83c5-238ad4a6ed31" containerName="dnsmasq-dns" Dec 16 15:17:38 crc kubenswrapper[4775]: E1216 15:17:38.679870 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cb4276f-72bc-4392-87bb-6a71b0e2233f" containerName="nova-metadata-log" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.679879 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cb4276f-72bc-4392-87bb-6a71b0e2233f" containerName="nova-metadata-log" Dec 16 15:17:38 crc kubenswrapper[4775]: E1216 15:17:38.679911 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cb4276f-72bc-4392-87bb-6a71b0e2233f" containerName="nova-metadata-metadata" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.679921 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cb4276f-72bc-4392-87bb-6a71b0e2233f" containerName="nova-metadata-metadata" Dec 16 15:17:38 crc kubenswrapper[4775]: E1216 15:17:38.679941 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46586ca7-367e-47d4-bd95-11037f7bb60f" containerName="nova-manage" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.679948 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="46586ca7-367e-47d4-bd95-11037f7bb60f" containerName="nova-manage" Dec 16 15:17:38 crc kubenswrapper[4775]: E1216 15:17:38.679978 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="799c9224-8212-452d-83c5-238ad4a6ed31" containerName="init" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.679991 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="799c9224-8212-452d-83c5-238ad4a6ed31" containerName="init" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.680239 4775 
memory_manager.go:354] "RemoveStaleState removing state" podUID="799c9224-8212-452d-83c5-238ad4a6ed31" containerName="dnsmasq-dns" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.680270 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="46586ca7-367e-47d4-bd95-11037f7bb60f" containerName="nova-manage" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.680283 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cb4276f-72bc-4392-87bb-6a71b0e2233f" containerName="nova-metadata-metadata" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.680302 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cb4276f-72bc-4392-87bb-6a71b0e2233f" containerName="nova-metadata-log" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.681730 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.690225 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.690416 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.692518 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:38 crc kubenswrapper[4775]: E1216 15:17:38.694320 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc\": container with ID starting with e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc not found: ID does not exist" containerID="e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.694382 4775 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc"} err="failed to get container status \"e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc\": rpc error: code = NotFound desc = could not find container \"e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc\": container with ID starting with e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc not found: ID does not exist" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.694418 4775 scope.go:117] "RemoveContainer" containerID="635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e" Dec 16 15:17:38 crc kubenswrapper[4775]: E1216 15:17:38.697012 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e\": container with ID starting with 635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e not found: ID does not exist" containerID="635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.697056 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e"} err="failed to get container status \"635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e\": rpc error: code = NotFound desc = could not find container \"635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e\": container with ID starting with 635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e not found: ID does not exist" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.697085 4775 scope.go:117] "RemoveContainer" containerID="e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.699098 4775 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc"} err="failed to get container status \"e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc\": rpc error: code = NotFound desc = could not find container \"e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc\": container with ID starting with e55f9f8c4db13ad6cf41d4dc33e3ed3f10ac151657bd18616514003a2b7b4ebc not found: ID does not exist" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.699123 4775 scope.go:117] "RemoveContainer" containerID="635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.702051 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e"} err="failed to get container status \"635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e\": rpc error: code = NotFound desc = could not find container \"635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e\": container with ID starting with 635ec92e349a8077672ad5137601620286ac78c372a210d71c8d9429d702ac8e not found: ID does not exist" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.851341 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.851414 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f34a902-d86a-49b7-bd28-d47d1896d0e9-logs\") pod \"nova-metadata-0\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " 
pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.851438 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-config-data\") pod \"nova-metadata-0\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.851732 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-624lb\" (UniqueName: \"kubernetes.io/projected/9f34a902-d86a-49b7-bd28-d47d1896d0e9-kube-api-access-624lb\") pod \"nova-metadata-0\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.851843 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.953563 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f34a902-d86a-49b7-bd28-d47d1896d0e9-logs\") pod \"nova-metadata-0\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.953638 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-config-data\") pod \"nova-metadata-0\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.953863 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-624lb\" (UniqueName: \"kubernetes.io/projected/9f34a902-d86a-49b7-bd28-d47d1896d0e9-kube-api-access-624lb\") pod \"nova-metadata-0\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.954115 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f34a902-d86a-49b7-bd28-d47d1896d0e9-logs\") pod \"nova-metadata-0\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.954366 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.955024 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.960910 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.961670 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-config-data\") pod \"nova-metadata-0\" 
(UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.961730 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4775]: I1216 15:17:38.972031 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-624lb\" (UniqueName: \"kubernetes.io/projected/9f34a902-d86a-49b7-bd28-d47d1896d0e9-kube-api-access-624lb\") pod \"nova-metadata-0\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " pod="openstack/nova-metadata-0" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.022602 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.068243 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jlnfn" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.161466 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-combined-ca-bundle\") pod \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\" (UID: \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\") " Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.161659 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpqsq\" (UniqueName: \"kubernetes.io/projected/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-kube-api-access-vpqsq\") pod \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\" (UID: \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\") " Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.161796 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-scripts\") pod \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\" (UID: \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\") " Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.161866 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-config-data\") pod \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\" (UID: \"b63d44e7-07c5-48d1-bd00-6e8be2b2c889\") " Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.177322 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-kube-api-access-vpqsq" (OuterVolumeSpecName: "kube-api-access-vpqsq") pod "b63d44e7-07c5-48d1-bd00-6e8be2b2c889" (UID: "b63d44e7-07c5-48d1-bd00-6e8be2b2c889"). InnerVolumeSpecName "kube-api-access-vpqsq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.179333 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-scripts" (OuterVolumeSpecName: "scripts") pod "b63d44e7-07c5-48d1-bd00-6e8be2b2c889" (UID: "b63d44e7-07c5-48d1-bd00-6e8be2b2c889"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.264826 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpqsq\" (UniqueName: \"kubernetes.io/projected/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-kube-api-access-vpqsq\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.264863 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.270132 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b63d44e7-07c5-48d1-bd00-6e8be2b2c889" (UID: "b63d44e7-07c5-48d1-bd00-6e8be2b2c889"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.270516 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-config-data" (OuterVolumeSpecName: "config-data") pod "b63d44e7-07c5-48d1-bd00-6e8be2b2c889" (UID: "b63d44e7-07c5-48d1-bd00-6e8be2b2c889"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.353170 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cb4276f-72bc-4392-87bb-6a71b0e2233f" path="/var/lib/kubelet/pods/6cb4276f-72bc-4392-87bb-6a71b0e2233f/volumes" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.366687 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.366725 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63d44e7-07c5-48d1-bd00-6e8be2b2c889-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.534224 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.579010 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f34a902-d86a-49b7-bd28-d47d1896d0e9","Type":"ContainerStarted","Data":"fcd2b21b3b0b53bfb181e2ee348c9fd91d554efe9c52e4b048d021c43eae137a"} Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.581560 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc6d50e-df14-4345-8522-7b3f41d8b956","Type":"ContainerStarted","Data":"9711b307d831e6e8c07ec5073c58b51ec2d250a104f0aaf4b6856e3d4eb68a04"} Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.591157 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jlnfn" event={"ID":"b63d44e7-07c5-48d1-bd00-6e8be2b2c889","Type":"ContainerDied","Data":"d2af24d8623dc8178a096ede4e16fc979af02c939a2e684282452d50315656a7"} Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.591383 4775 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jlnfn" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.591393 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2af24d8623dc8178a096ede4e16fc979af02c939a2e684282452d50315656a7" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.651200 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 15:17:39 crc kubenswrapper[4775]: E1216 15:17:39.651669 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63d44e7-07c5-48d1-bd00-6e8be2b2c889" containerName="nova-cell1-conductor-db-sync" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.651690 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63d44e7-07c5-48d1-bd00-6e8be2b2c889" containerName="nova-cell1-conductor-db-sync" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.651936 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b63d44e7-07c5-48d1-bd00-6e8be2b2c889" containerName="nova-cell1-conductor-db-sync" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.652554 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.654964 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.665365 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.773871 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cxs8\" (UniqueName: \"kubernetes.io/projected/b19dccbe-2434-48ae-8822-1ced3b7167c7-kube-api-access-6cxs8\") pod \"nova-cell1-conductor-0\" (UID: \"b19dccbe-2434-48ae-8822-1ced3b7167c7\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.774221 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b19dccbe-2434-48ae-8822-1ced3b7167c7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b19dccbe-2434-48ae-8822-1ced3b7167c7\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.774298 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b19dccbe-2434-48ae-8822-1ced3b7167c7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b19dccbe-2434-48ae-8822-1ced3b7167c7\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:17:39 crc kubenswrapper[4775]: E1216 15:17:39.864586 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a0f0ce0fa15e11e29f3d55172cc152248d772df1a829809b3d72a28466f4495e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 
15:17:39 crc kubenswrapper[4775]: E1216 15:17:39.866054 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a0f0ce0fa15e11e29f3d55172cc152248d772df1a829809b3d72a28466f4495e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 15:17:39 crc kubenswrapper[4775]: E1216 15:17:39.867242 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a0f0ce0fa15e11e29f3d55172cc152248d772df1a829809b3d72a28466f4495e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 15:17:39 crc kubenswrapper[4775]: E1216 15:17:39.867283 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="be193a0f-15ac-4a52-b1c3-2174a3ac1864" containerName="nova-scheduler-scheduler" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.876241 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b19dccbe-2434-48ae-8822-1ced3b7167c7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b19dccbe-2434-48ae-8822-1ced3b7167c7\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.876351 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b19dccbe-2434-48ae-8822-1ced3b7167c7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b19dccbe-2434-48ae-8822-1ced3b7167c7\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.876391 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cxs8\" (UniqueName: \"kubernetes.io/projected/b19dccbe-2434-48ae-8822-1ced3b7167c7-kube-api-access-6cxs8\") pod \"nova-cell1-conductor-0\" (UID: \"b19dccbe-2434-48ae-8822-1ced3b7167c7\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.884261 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b19dccbe-2434-48ae-8822-1ced3b7167c7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b19dccbe-2434-48ae-8822-1ced3b7167c7\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.884316 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b19dccbe-2434-48ae-8822-1ced3b7167c7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b19dccbe-2434-48ae-8822-1ced3b7167c7\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.894504 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cxs8\" (UniqueName: \"kubernetes.io/projected/b19dccbe-2434-48ae-8822-1ced3b7167c7-kube-api-access-6cxs8\") pod \"nova-cell1-conductor-0\" (UID: \"b19dccbe-2434-48ae-8822-1ced3b7167c7\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:17:39 crc kubenswrapper[4775]: I1216 15:17:39.969874 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 15:17:40 crc kubenswrapper[4775]: I1216 15:17:40.603129 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 15:17:40 crc kubenswrapper[4775]: I1216 15:17:40.622786 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f34a902-d86a-49b7-bd28-d47d1896d0e9","Type":"ContainerStarted","Data":"d6ad92d47c32e5c273c39729e5c7947fd4b13cb81e1b6efa014c57f4f7247fd8"} Dec 16 15:17:40 crc kubenswrapper[4775]: I1216 15:17:40.624028 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f34a902-d86a-49b7-bd28-d47d1896d0e9","Type":"ContainerStarted","Data":"8c3117cbae2f0506b8005a0ec68a9aab8b7833ac524ecf3d2c904d3a1eaa39f2"} Dec 16 15:17:40 crc kubenswrapper[4775]: I1216 15:17:40.650372 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.650352998 podStartE2EDuration="2.650352998s" podCreationTimestamp="2025-12-16 15:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:17:40.643275689 +0000 UTC m=+1385.594354612" watchObservedRunningTime="2025-12-16 15:17:40.650352998 +0000 UTC m=+1385.601431911" Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.635420 4775 generic.go:334] "Generic (PLEG): container finished" podID="591cb000-dea7-4b70-a162-211974a0e8a8" containerID="9bcaaec2564cb5ffff7898286e099b3dd73deb6bc52cd3d3dd77958648eb0768" exitCode=0 Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.635511 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"591cb000-dea7-4b70-a162-211974a0e8a8","Type":"ContainerDied","Data":"9bcaaec2564cb5ffff7898286e099b3dd73deb6bc52cd3d3dd77958648eb0768"} Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.635860 
4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"591cb000-dea7-4b70-a162-211974a0e8a8","Type":"ContainerDied","Data":"9ab2f27b05a0ce57c20cad41d8f9e67a7cc5b071dedf2e01620dd08ca9d64c92"} Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.635907 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ab2f27b05a0ce57c20cad41d8f9e67a7cc5b071dedf2e01620dd08ca9d64c92" Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.639428 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b19dccbe-2434-48ae-8822-1ced3b7167c7","Type":"ContainerStarted","Data":"6d77e2b451df80004b7928db3bde43c5509ffd4e8cf8745fbb0eef3925e49f40"} Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.639469 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b19dccbe-2434-48ae-8822-1ced3b7167c7","Type":"ContainerStarted","Data":"bb971d9e5a8e2cc9d41d203e0dfc42213e06cbd27c807ba78ef57365eb67eed9"} Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.639568 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.642998 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc6d50e-df14-4345-8522-7b3f41d8b956","Type":"ContainerStarted","Data":"5a1c0c85db095fa1e0d5ee0c87112c68565475315a4fbddc94fb2925e3db2d9f"} Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.665037 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.665012815 podStartE2EDuration="2.665012815s" podCreationTimestamp="2025-12-16 15:17:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:17:41.660446708 +0000 UTC 
m=+1386.611525641" watchObservedRunningTime="2025-12-16 15:17:41.665012815 +0000 UTC m=+1386.616091748" Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.671592 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.709640 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.658602075 podStartE2EDuration="8.709617032s" podCreationTimestamp="2025-12-16 15:17:33 +0000 UTC" firstStartedPulling="2025-12-16 15:17:34.672341831 +0000 UTC m=+1379.623420754" lastFinishedPulling="2025-12-16 15:17:40.723356788 +0000 UTC m=+1385.674435711" observedRunningTime="2025-12-16 15:17:41.683960815 +0000 UTC m=+1386.635039738" watchObservedRunningTime="2025-12-16 15:17:41.709617032 +0000 UTC m=+1386.660695955" Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.716929 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrz4t\" (UniqueName: \"kubernetes.io/projected/591cb000-dea7-4b70-a162-211974a0e8a8-kube-api-access-jrz4t\") pod \"591cb000-dea7-4b70-a162-211974a0e8a8\" (UID: \"591cb000-dea7-4b70-a162-211974a0e8a8\") " Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.717084 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/591cb000-dea7-4b70-a162-211974a0e8a8-logs\") pod \"591cb000-dea7-4b70-a162-211974a0e8a8\" (UID: \"591cb000-dea7-4b70-a162-211974a0e8a8\") " Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.717114 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591cb000-dea7-4b70-a162-211974a0e8a8-config-data\") pod \"591cb000-dea7-4b70-a162-211974a0e8a8\" (UID: \"591cb000-dea7-4b70-a162-211974a0e8a8\") " Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.717151 
4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591cb000-dea7-4b70-a162-211974a0e8a8-combined-ca-bundle\") pod \"591cb000-dea7-4b70-a162-211974a0e8a8\" (UID: \"591cb000-dea7-4b70-a162-211974a0e8a8\") " Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.718225 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/591cb000-dea7-4b70-a162-211974a0e8a8-logs" (OuterVolumeSpecName: "logs") pod "591cb000-dea7-4b70-a162-211974a0e8a8" (UID: "591cb000-dea7-4b70-a162-211974a0e8a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.724037 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/591cb000-dea7-4b70-a162-211974a0e8a8-kube-api-access-jrz4t" (OuterVolumeSpecName: "kube-api-access-jrz4t") pod "591cb000-dea7-4b70-a162-211974a0e8a8" (UID: "591cb000-dea7-4b70-a162-211974a0e8a8"). InnerVolumeSpecName "kube-api-access-jrz4t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.735428 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrz4t\" (UniqueName: \"kubernetes.io/projected/591cb000-dea7-4b70-a162-211974a0e8a8-kube-api-access-jrz4t\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.735464 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/591cb000-dea7-4b70-a162-211974a0e8a8-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.748187 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591cb000-dea7-4b70-a162-211974a0e8a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "591cb000-dea7-4b70-a162-211974a0e8a8" (UID: "591cb000-dea7-4b70-a162-211974a0e8a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.750629 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591cb000-dea7-4b70-a162-211974a0e8a8-config-data" (OuterVolumeSpecName: "config-data") pod "591cb000-dea7-4b70-a162-211974a0e8a8" (UID: "591cb000-dea7-4b70-a162-211974a0e8a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.836991 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/591cb000-dea7-4b70-a162-211974a0e8a8-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:41 crc kubenswrapper[4775]: I1216 15:17:41.837028 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591cb000-dea7-4b70-a162-211974a0e8a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.550592 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.650847 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be193a0f-15ac-4a52-b1c3-2174a3ac1864-config-data\") pod \"be193a0f-15ac-4a52-b1c3-2174a3ac1864\" (UID: \"be193a0f-15ac-4a52-b1c3-2174a3ac1864\") " Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.650960 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkqg5\" (UniqueName: \"kubernetes.io/projected/be193a0f-15ac-4a52-b1c3-2174a3ac1864-kube-api-access-kkqg5\") pod \"be193a0f-15ac-4a52-b1c3-2174a3ac1864\" (UID: \"be193a0f-15ac-4a52-b1c3-2174a3ac1864\") " Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.651237 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be193a0f-15ac-4a52-b1c3-2174a3ac1864-combined-ca-bundle\") pod \"be193a0f-15ac-4a52-b1c3-2174a3ac1864\" (UID: \"be193a0f-15ac-4a52-b1c3-2174a3ac1864\") " Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.654787 4775 generic.go:334] "Generic (PLEG): container finished" 
podID="be193a0f-15ac-4a52-b1c3-2174a3ac1864" containerID="a0f0ce0fa15e11e29f3d55172cc152248d772df1a829809b3d72a28466f4495e" exitCode=0 Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.654841 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.654917 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.654918 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be193a0f-15ac-4a52-b1c3-2174a3ac1864","Type":"ContainerDied","Data":"a0f0ce0fa15e11e29f3d55172cc152248d772df1a829809b3d72a28466f4495e"} Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.654972 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be193a0f-15ac-4a52-b1c3-2174a3ac1864","Type":"ContainerDied","Data":"7b2562bf40534f00e31af0f74a4d1d2c95d19b10f3b321b6d6061b367094de01"} Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.654999 4775 scope.go:117] "RemoveContainer" containerID="a0f0ce0fa15e11e29f3d55172cc152248d772df1a829809b3d72a28466f4495e" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.655482 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.674817 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be193a0f-15ac-4a52-b1c3-2174a3ac1864-kube-api-access-kkqg5" (OuterVolumeSpecName: "kube-api-access-kkqg5") pod "be193a0f-15ac-4a52-b1c3-2174a3ac1864" (UID: "be193a0f-15ac-4a52-b1c3-2174a3ac1864"). InnerVolumeSpecName "kube-api-access-kkqg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.684603 4775 scope.go:117] "RemoveContainer" containerID="a0f0ce0fa15e11e29f3d55172cc152248d772df1a829809b3d72a28466f4495e" Dec 16 15:17:42 crc kubenswrapper[4775]: E1216 15:17:42.686360 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0f0ce0fa15e11e29f3d55172cc152248d772df1a829809b3d72a28466f4495e\": container with ID starting with a0f0ce0fa15e11e29f3d55172cc152248d772df1a829809b3d72a28466f4495e not found: ID does not exist" containerID="a0f0ce0fa15e11e29f3d55172cc152248d772df1a829809b3d72a28466f4495e" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.686430 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0f0ce0fa15e11e29f3d55172cc152248d772df1a829809b3d72a28466f4495e"} err="failed to get container status \"a0f0ce0fa15e11e29f3d55172cc152248d772df1a829809b3d72a28466f4495e\": rpc error: code = NotFound desc = could not find container \"a0f0ce0fa15e11e29f3d55172cc152248d772df1a829809b3d72a28466f4495e\": container with ID starting with a0f0ce0fa15e11e29f3d55172cc152248d772df1a829809b3d72a28466f4495e not found: ID does not exist" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.689399 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be193a0f-15ac-4a52-b1c3-2174a3ac1864-config-data" (OuterVolumeSpecName: "config-data") pod "be193a0f-15ac-4a52-b1c3-2174a3ac1864" (UID: "be193a0f-15ac-4a52-b1c3-2174a3ac1864"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.696543 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be193a0f-15ac-4a52-b1c3-2174a3ac1864-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be193a0f-15ac-4a52-b1c3-2174a3ac1864" (UID: "be193a0f-15ac-4a52-b1c3-2174a3ac1864"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.757582 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be193a0f-15ac-4a52-b1c3-2174a3ac1864-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.757619 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be193a0f-15ac-4a52-b1c3-2174a3ac1864-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.757633 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkqg5\" (UniqueName: \"kubernetes.io/projected/be193a0f-15ac-4a52-b1c3-2174a3ac1864-kube-api-access-kkqg5\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.800653 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.815019 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.838143 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:42 crc kubenswrapper[4775]: E1216 15:17:42.838546 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be193a0f-15ac-4a52-b1c3-2174a3ac1864" containerName="nova-scheduler-scheduler" Dec 16 15:17:42 crc 
kubenswrapper[4775]: I1216 15:17:42.838570 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="be193a0f-15ac-4a52-b1c3-2174a3ac1864" containerName="nova-scheduler-scheduler" Dec 16 15:17:42 crc kubenswrapper[4775]: E1216 15:17:42.838586 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591cb000-dea7-4b70-a162-211974a0e8a8" containerName="nova-api-api" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.838592 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="591cb000-dea7-4b70-a162-211974a0e8a8" containerName="nova-api-api" Dec 16 15:17:42 crc kubenswrapper[4775]: E1216 15:17:42.838619 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591cb000-dea7-4b70-a162-211974a0e8a8" containerName="nova-api-log" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.838624 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="591cb000-dea7-4b70-a162-211974a0e8a8" containerName="nova-api-log" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.838792 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="591cb000-dea7-4b70-a162-211974a0e8a8" containerName="nova-api-log" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.838820 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="591cb000-dea7-4b70-a162-211974a0e8a8" containerName="nova-api-api" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.838843 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="be193a0f-15ac-4a52-b1c3-2174a3ac1864" containerName="nova-scheduler-scheduler" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.839839 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.844109 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.853915 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.964133 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb27686-174d-4795-8859-af77a1964fb5-config-data\") pod \"nova-api-0\" (UID: \"acb27686-174d-4795-8859-af77a1964fb5\") " pod="openstack/nova-api-0" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.964439 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acb27686-174d-4795-8859-af77a1964fb5-logs\") pod \"nova-api-0\" (UID: \"acb27686-174d-4795-8859-af77a1964fb5\") " pod="openstack/nova-api-0" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.964577 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2swks\" (UniqueName: \"kubernetes.io/projected/acb27686-174d-4795-8859-af77a1964fb5-kube-api-access-2swks\") pod \"nova-api-0\" (UID: \"acb27686-174d-4795-8859-af77a1964fb5\") " pod="openstack/nova-api-0" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.964832 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb27686-174d-4795-8859-af77a1964fb5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"acb27686-174d-4795-8859-af77a1964fb5\") " pod="openstack/nova-api-0" Dec 16 15:17:42 crc kubenswrapper[4775]: I1216 15:17:42.991388 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 
15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.003120 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.017204 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.018970 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.022617 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.032139 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.066102 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb27686-174d-4795-8859-af77a1964fb5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"acb27686-174d-4795-8859-af77a1964fb5\") " pod="openstack/nova-api-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.066448 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwx7z\" (UniqueName: \"kubernetes.io/projected/66a41c08-de0e-46c7-ae0c-b56ba2544af5-kube-api-access-jwx7z\") pod \"nova-scheduler-0\" (UID: \"66a41c08-de0e-46c7-ae0c-b56ba2544af5\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.066677 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb27686-174d-4795-8859-af77a1964fb5-config-data\") pod \"nova-api-0\" (UID: \"acb27686-174d-4795-8859-af77a1964fb5\") " pod="openstack/nova-api-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.066819 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acb27686-174d-4795-8859-af77a1964fb5-logs\") pod \"nova-api-0\" (UID: \"acb27686-174d-4795-8859-af77a1964fb5\") " pod="openstack/nova-api-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.066963 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2swks\" (UniqueName: \"kubernetes.io/projected/acb27686-174d-4795-8859-af77a1964fb5-kube-api-access-2swks\") pod \"nova-api-0\" (UID: \"acb27686-174d-4795-8859-af77a1964fb5\") " pod="openstack/nova-api-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.067090 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66a41c08-de0e-46c7-ae0c-b56ba2544af5-config-data\") pod \"nova-scheduler-0\" (UID: \"66a41c08-de0e-46c7-ae0c-b56ba2544af5\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.067223 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acb27686-174d-4795-8859-af77a1964fb5-logs\") pod \"nova-api-0\" (UID: \"acb27686-174d-4795-8859-af77a1964fb5\") " pod="openstack/nova-api-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.067362 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a41c08-de0e-46c7-ae0c-b56ba2544af5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"66a41c08-de0e-46c7-ae0c-b56ba2544af5\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.070705 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb27686-174d-4795-8859-af77a1964fb5-config-data\") pod \"nova-api-0\" (UID: 
\"acb27686-174d-4795-8859-af77a1964fb5\") " pod="openstack/nova-api-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.070954 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb27686-174d-4795-8859-af77a1964fb5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"acb27686-174d-4795-8859-af77a1964fb5\") " pod="openstack/nova-api-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.094810 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2swks\" (UniqueName: \"kubernetes.io/projected/acb27686-174d-4795-8859-af77a1964fb5-kube-api-access-2swks\") pod \"nova-api-0\" (UID: \"acb27686-174d-4795-8859-af77a1964fb5\") " pod="openstack/nova-api-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.157612 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.169113 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a41c08-de0e-46c7-ae0c-b56ba2544af5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"66a41c08-de0e-46c7-ae0c-b56ba2544af5\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.169523 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwx7z\" (UniqueName: \"kubernetes.io/projected/66a41c08-de0e-46c7-ae0c-b56ba2544af5-kube-api-access-jwx7z\") pod \"nova-scheduler-0\" (UID: \"66a41c08-de0e-46c7-ae0c-b56ba2544af5\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.169600 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66a41c08-de0e-46c7-ae0c-b56ba2544af5-config-data\") pod \"nova-scheduler-0\" (UID: 
\"66a41c08-de0e-46c7-ae0c-b56ba2544af5\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.172934 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66a41c08-de0e-46c7-ae0c-b56ba2544af5-config-data\") pod \"nova-scheduler-0\" (UID: \"66a41c08-de0e-46c7-ae0c-b56ba2544af5\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.173391 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a41c08-de0e-46c7-ae0c-b56ba2544af5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"66a41c08-de0e-46c7-ae0c-b56ba2544af5\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.199712 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwx7z\" (UniqueName: \"kubernetes.io/projected/66a41c08-de0e-46c7-ae0c-b56ba2544af5-kube-api-access-jwx7z\") pod \"nova-scheduler-0\" (UID: \"66a41c08-de0e-46c7-ae0c-b56ba2544af5\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.341737 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.350121 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="591cb000-dea7-4b70-a162-211974a0e8a8" path="/var/lib/kubelet/pods/591cb000-dea7-4b70-a162-211974a0e8a8/volumes" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.350700 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be193a0f-15ac-4a52-b1c3-2174a3ac1864" path="/var/lib/kubelet/pods/be193a0f-15ac-4a52-b1c3-2174a3ac1864/volumes" Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.669424 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:43 crc kubenswrapper[4775]: I1216 15:17:43.837988 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:17:44 crc kubenswrapper[4775]: I1216 15:17:44.023273 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 15:17:44 crc kubenswrapper[4775]: I1216 15:17:44.023601 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 15:17:44 crc kubenswrapper[4775]: I1216 15:17:44.677955 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"66a41c08-de0e-46c7-ae0c-b56ba2544af5","Type":"ContainerStarted","Data":"a57cffbc0e696fc424fe650fd406d8400529e0266328256e2fa086f085050576"} Dec 16 15:17:44 crc kubenswrapper[4775]: I1216 15:17:44.679559 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"66a41c08-de0e-46c7-ae0c-b56ba2544af5","Type":"ContainerStarted","Data":"e2ff7cda78fbcc0cd39d4966e8d4a8170f2de2952d5011e37805ebd7b5dc7ea7"} Dec 16 15:17:44 crc kubenswrapper[4775]: I1216 15:17:44.684043 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"acb27686-174d-4795-8859-af77a1964fb5","Type":"ContainerStarted","Data":"ddb2b9ef0d187c77d3a151851198409019ec187ee84dd8433587adfa1a980ece"} Dec 16 15:17:44 crc kubenswrapper[4775]: I1216 15:17:44.684086 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"acb27686-174d-4795-8859-af77a1964fb5","Type":"ContainerStarted","Data":"8be6db6ccb0d882cb2cc26581e3cfaf9253da95ec5178fc48f9b138d3ad28ebb"} Dec 16 15:17:44 crc kubenswrapper[4775]: I1216 15:17:44.684106 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"acb27686-174d-4795-8859-af77a1964fb5","Type":"ContainerStarted","Data":"f21579d7a2ca97798b8b2a071531470fb27dc1db52d750485c54f5af31a0b601"} Dec 16 15:17:44 crc kubenswrapper[4775]: I1216 15:17:44.703770 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.703746401 podStartE2EDuration="2.703746401s" podCreationTimestamp="2025-12-16 15:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:17:44.700153615 +0000 UTC m=+1389.651232558" watchObservedRunningTime="2025-12-16 15:17:44.703746401 +0000 UTC m=+1389.654825324" Dec 16 15:17:44 crc kubenswrapper[4775]: I1216 15:17:44.718923 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.7189025879999997 podStartE2EDuration="2.718902588s" podCreationTimestamp="2025-12-16 15:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:17:44.718673891 +0000 UTC m=+1389.669752844" watchObservedRunningTime="2025-12-16 15:17:44.718902588 +0000 UTC m=+1389.669981511" Dec 16 15:17:48 crc kubenswrapper[4775]: I1216 15:17:48.342983 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-scheduler-0" Dec 16 15:17:49 crc kubenswrapper[4775]: I1216 15:17:49.023729 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 15:17:49 crc kubenswrapper[4775]: I1216 15:17:49.024035 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 15:17:49 crc kubenswrapper[4775]: I1216 15:17:49.996528 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 16 15:17:50 crc kubenswrapper[4775]: I1216 15:17:50.038282 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9f34a902-d86a-49b7-bd28-d47d1896d0e9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 15:17:50 crc kubenswrapper[4775]: I1216 15:17:50.038282 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9f34a902-d86a-49b7-bd28-d47d1896d0e9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 15:17:53 crc kubenswrapper[4775]: I1216 15:17:53.159013 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 15:17:53 crc kubenswrapper[4775]: I1216 15:17:53.159464 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 15:17:53 crc kubenswrapper[4775]: I1216 15:17:53.351057 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 15:17:53 crc kubenswrapper[4775]: I1216 15:17:53.379384 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 15:17:53 crc 
kubenswrapper[4775]: I1216 15:17:53.805357 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 15:17:54 crc kubenswrapper[4775]: I1216 15:17:54.240222 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="acb27686-174d-4795-8859-af77a1964fb5" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 15:17:54 crc kubenswrapper[4775]: I1216 15:17:54.240172 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="acb27686-174d-4795-8859-af77a1964fb5" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 15:17:59 crc kubenswrapper[4775]: I1216 15:17:59.030652 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 15:17:59 crc kubenswrapper[4775]: I1216 15:17:59.033131 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 15:17:59 crc kubenswrapper[4775]: I1216 15:17:59.036453 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 15:17:59 crc kubenswrapper[4775]: I1216 15:17:59.833626 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 15:18:01 crc kubenswrapper[4775]: E1216 15:18:01.689109 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93c495eb_7d30_4db8_8c2b_47410834e889.slice/crio-b7b68e3a0262a8a8082c4920ba9713c9becf917dec8a90cb506ae627478b6193.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93c495eb_7d30_4db8_8c2b_47410834e889.slice/crio-conmon-b7b68e3a0262a8a8082c4920ba9713c9becf917dec8a90cb506ae627478b6193.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe193a0f_15ac_4a52_b1c3_2174a3ac1864.slice/crio-7b2562bf40534f00e31af0f74a4d1d2c95d19b10f3b321b6d6061b367094de01\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe193a0f_15ac_4a52_b1c3_2174a3ac1864.slice\": RecentStats: unable to find data in memory cache]" Dec 16 15:18:01 crc kubenswrapper[4775]: I1216 15:18:01.844860 4775 generic.go:334] "Generic (PLEG): container finished" podID="93c495eb-7d30-4db8-8c2b-47410834e889" containerID="b7b68e3a0262a8a8082c4920ba9713c9becf917dec8a90cb506ae627478b6193" exitCode=137 Dec 16 15:18:01 crc kubenswrapper[4775]: I1216 15:18:01.844931 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"93c495eb-7d30-4db8-8c2b-47410834e889","Type":"ContainerDied","Data":"b7b68e3a0262a8a8082c4920ba9713c9becf917dec8a90cb506ae627478b6193"} Dec 16 15:18:01 crc kubenswrapper[4775]: I1216 15:18:01.845415 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"93c495eb-7d30-4db8-8c2b-47410834e889","Type":"ContainerDied","Data":"11f2da6b343233b2f148600c572aaaafa2917a0acc06008279c9a0ed93199881"} Dec 16 15:18:01 crc kubenswrapper[4775]: I1216 15:18:01.845441 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11f2da6b343233b2f148600c572aaaafa2917a0acc06008279c9a0ed93199881" Dec 16 15:18:01 crc kubenswrapper[4775]: I1216 15:18:01.853480 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:01 crc kubenswrapper[4775]: I1216 15:18:01.987567 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93c495eb-7d30-4db8-8c2b-47410834e889-config-data\") pod \"93c495eb-7d30-4db8-8c2b-47410834e889\" (UID: \"93c495eb-7d30-4db8-8c2b-47410834e889\") " Dec 16 15:18:01 crc kubenswrapper[4775]: I1216 15:18:01.987653 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsh8b\" (UniqueName: \"kubernetes.io/projected/93c495eb-7d30-4db8-8c2b-47410834e889-kube-api-access-qsh8b\") pod \"93c495eb-7d30-4db8-8c2b-47410834e889\" (UID: \"93c495eb-7d30-4db8-8c2b-47410834e889\") " Dec 16 15:18:01 crc kubenswrapper[4775]: I1216 15:18:01.987688 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93c495eb-7d30-4db8-8c2b-47410834e889-combined-ca-bundle\") pod \"93c495eb-7d30-4db8-8c2b-47410834e889\" (UID: \"93c495eb-7d30-4db8-8c2b-47410834e889\") " Dec 16 15:18:01 crc kubenswrapper[4775]: I1216 15:18:01.993453 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93c495eb-7d30-4db8-8c2b-47410834e889-kube-api-access-qsh8b" (OuterVolumeSpecName: "kube-api-access-qsh8b") pod "93c495eb-7d30-4db8-8c2b-47410834e889" (UID: "93c495eb-7d30-4db8-8c2b-47410834e889"). InnerVolumeSpecName "kube-api-access-qsh8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.016693 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c495eb-7d30-4db8-8c2b-47410834e889-config-data" (OuterVolumeSpecName: "config-data") pod "93c495eb-7d30-4db8-8c2b-47410834e889" (UID: "93c495eb-7d30-4db8-8c2b-47410834e889"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.018773 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c495eb-7d30-4db8-8c2b-47410834e889-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93c495eb-7d30-4db8-8c2b-47410834e889" (UID: "93c495eb-7d30-4db8-8c2b-47410834e889"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.090785 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93c495eb-7d30-4db8-8c2b-47410834e889-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.091038 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsh8b\" (UniqueName: \"kubernetes.io/projected/93c495eb-7d30-4db8-8c2b-47410834e889-kube-api-access-qsh8b\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.091167 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93c495eb-7d30-4db8-8c2b-47410834e889-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.855814 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.869473 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.869944 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.896270 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.907532 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.947875 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:18:02 crc kubenswrapper[4775]: E1216 15:18:02.948436 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c495eb-7d30-4db8-8c2b-47410834e889" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.948461 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c495eb-7d30-4db8-8c2b-47410834e889" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.948722 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="93c495eb-7d30-4db8-8c2b-47410834e889" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.949565 
4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.952266 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.952516 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.952671 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 16 15:18:02 crc kubenswrapper[4775]: I1216 15:18:02.959225 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.011302 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2451285e-33a6-42ca-b8f9-336131211c7b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2451285e-33a6-42ca-b8f9-336131211c7b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.011391 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2451285e-33a6-42ca-b8f9-336131211c7b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2451285e-33a6-42ca-b8f9-336131211c7b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.011507 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdchp\" (UniqueName: \"kubernetes.io/projected/2451285e-33a6-42ca-b8f9-336131211c7b-kube-api-access-rdchp\") pod \"nova-cell1-novncproxy-0\" (UID: \"2451285e-33a6-42ca-b8f9-336131211c7b\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.011631 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2451285e-33a6-42ca-b8f9-336131211c7b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2451285e-33a6-42ca-b8f9-336131211c7b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.011706 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2451285e-33a6-42ca-b8f9-336131211c7b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2451285e-33a6-42ca-b8f9-336131211c7b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.112615 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdchp\" (UniqueName: \"kubernetes.io/projected/2451285e-33a6-42ca-b8f9-336131211c7b-kube-api-access-rdchp\") pod \"nova-cell1-novncproxy-0\" (UID: \"2451285e-33a6-42ca-b8f9-336131211c7b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.112730 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2451285e-33a6-42ca-b8f9-336131211c7b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2451285e-33a6-42ca-b8f9-336131211c7b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.112785 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2451285e-33a6-42ca-b8f9-336131211c7b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2451285e-33a6-42ca-b8f9-336131211c7b\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.112907 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2451285e-33a6-42ca-b8f9-336131211c7b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2451285e-33a6-42ca-b8f9-336131211c7b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.112951 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2451285e-33a6-42ca-b8f9-336131211c7b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2451285e-33a6-42ca-b8f9-336131211c7b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.118294 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2451285e-33a6-42ca-b8f9-336131211c7b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2451285e-33a6-42ca-b8f9-336131211c7b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.118744 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2451285e-33a6-42ca-b8f9-336131211c7b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2451285e-33a6-42ca-b8f9-336131211c7b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.120492 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2451285e-33a6-42ca-b8f9-336131211c7b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2451285e-33a6-42ca-b8f9-336131211c7b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.121856 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2451285e-33a6-42ca-b8f9-336131211c7b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2451285e-33a6-42ca-b8f9-336131211c7b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.134706 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdchp\" (UniqueName: \"kubernetes.io/projected/2451285e-33a6-42ca-b8f9-336131211c7b-kube-api-access-rdchp\") pod \"nova-cell1-novncproxy-0\" (UID: \"2451285e-33a6-42ca-b8f9-336131211c7b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.161809 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.162788 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.165143 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.171137 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.279928 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.356415 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93c495eb-7d30-4db8-8c2b-47410834e889" path="/var/lib/kubelet/pods/93c495eb-7d30-4db8-8c2b-47410834e889/volumes" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.747144 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:18:03 crc kubenswrapper[4775]: W1216 15:18:03.751908 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2451285e_33a6_42ca_b8f9_336131211c7b.slice/crio-d04513c49da6957dbfa16997f120d1074abbc2d5c4d4fc3cffc94d713bdec7eb WatchSource:0}: Error finding container d04513c49da6957dbfa16997f120d1074abbc2d5c4d4fc3cffc94d713bdec7eb: Status 404 returned error can't find the container with id d04513c49da6957dbfa16997f120d1074abbc2d5c4d4fc3cffc94d713bdec7eb Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.868201 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2451285e-33a6-42ca-b8f9-336131211c7b","Type":"ContainerStarted","Data":"d04513c49da6957dbfa16997f120d1074abbc2d5c4d4fc3cffc94d713bdec7eb"} Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.868450 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 15:18:03 crc kubenswrapper[4775]: I1216 15:18:03.872906 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.113480 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-69b7h"] Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.115661 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.134742 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.134792 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6sst\" (UniqueName: \"kubernetes.io/projected/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-kube-api-access-p6sst\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.134837 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.134867 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.134994 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-config\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" 
(UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.135018 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.147148 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-69b7h"] Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.242057 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6sst\" (UniqueName: \"kubernetes.io/projected/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-kube-api-access-p6sst\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.242147 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.242186 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.242292 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-config\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.242314 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.242425 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.243601 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.243976 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-config\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.246679 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.247388 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.250567 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.271138 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6sst\" (UniqueName: \"kubernetes.io/projected/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-kube-api-access-p6sst\") pod \"dnsmasq-dns-6b7bbf7cf9-69b7h\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.345710 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.515233 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.882508 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2451285e-33a6-42ca-b8f9-336131211c7b","Type":"ContainerStarted","Data":"5c8802495eacb124f60627edc4bab93ef6c222f36523c163094860b39bc217b0"} Dec 16 15:18:04 crc kubenswrapper[4775]: I1216 15:18:04.909127 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.909104176 podStartE2EDuration="2.909104176s" podCreationTimestamp="2025-12-16 15:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:18:04.907622829 +0000 UTC m=+1409.858701772" watchObservedRunningTime="2025-12-16 15:18:04.909104176 +0000 UTC m=+1409.860183099" Dec 16 15:18:05 crc kubenswrapper[4775]: I1216 15:18:05.000315 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-69b7h"] Dec 16 15:18:05 crc kubenswrapper[4775]: W1216 15:18:05.009070 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b8b1fb4_1c9e_4920_aa5e_767437ef455f.slice/crio-b53fbe9d9ac25c1a1ffcdf4da293e5ee9fc6659459513827252c2dd877a68706 WatchSource:0}: Error finding container b53fbe9d9ac25c1a1ffcdf4da293e5ee9fc6659459513827252c2dd877a68706: Status 404 returned error can't find the container with id b53fbe9d9ac25c1a1ffcdf4da293e5ee9fc6659459513827252c2dd877a68706 Dec 16 15:18:05 crc kubenswrapper[4775]: I1216 15:18:05.892089 4775 generic.go:334] "Generic (PLEG): container finished" podID="1b8b1fb4-1c9e-4920-aa5e-767437ef455f" containerID="2e199d5870654836c2e3c777abea7c81f9f4e1a00a91978fdf2a0c4dbcaf1740" exitCode=0 Dec 16 15:18:05 crc kubenswrapper[4775]: I1216 15:18:05.892150 4775 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" event={"ID":"1b8b1fb4-1c9e-4920-aa5e-767437ef455f","Type":"ContainerDied","Data":"2e199d5870654836c2e3c777abea7c81f9f4e1a00a91978fdf2a0c4dbcaf1740"} Dec 16 15:18:05 crc kubenswrapper[4775]: I1216 15:18:05.892407 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" event={"ID":"1b8b1fb4-1c9e-4920-aa5e-767437ef455f","Type":"ContainerStarted","Data":"b53fbe9d9ac25c1a1ffcdf4da293e5ee9fc6659459513827252c2dd877a68706"} Dec 16 15:18:06 crc kubenswrapper[4775]: I1216 15:18:06.364215 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:18:06 crc kubenswrapper[4775]: I1216 15:18:06.364752 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerName="ceilometer-central-agent" containerID="cri-o://abd091ea75c441b1b32f3067d5db38cc99820ebfe68250bc5f3dfd39608c940d" gracePeriod=30 Dec 16 15:18:06 crc kubenswrapper[4775]: I1216 15:18:06.365036 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerName="proxy-httpd" containerID="cri-o://5a1c0c85db095fa1e0d5ee0c87112c68565475315a4fbddc94fb2925e3db2d9f" gracePeriod=30 Dec 16 15:18:06 crc kubenswrapper[4775]: I1216 15:18:06.365178 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerName="ceilometer-notification-agent" containerID="cri-o://3b0eeed3eb498d88f58a831fd428e32689aed36255274f4595928f2573627371" gracePeriod=30 Dec 16 15:18:06 crc kubenswrapper[4775]: I1216 15:18:06.365236 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerName="sg-core" 
containerID="cri-o://9711b307d831e6e8c07ec5073c58b51ec2d250a104f0aaf4b6856e3d4eb68a04" gracePeriod=30 Dec 16 15:18:06 crc kubenswrapper[4775]: I1216 15:18:06.609536 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:18:06 crc kubenswrapper[4775]: I1216 15:18:06.904823 4775 generic.go:334] "Generic (PLEG): container finished" podID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerID="5a1c0c85db095fa1e0d5ee0c87112c68565475315a4fbddc94fb2925e3db2d9f" exitCode=0 Dec 16 15:18:06 crc kubenswrapper[4775]: I1216 15:18:06.905169 4775 generic.go:334] "Generic (PLEG): container finished" podID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerID="9711b307d831e6e8c07ec5073c58b51ec2d250a104f0aaf4b6856e3d4eb68a04" exitCode=2 Dec 16 15:18:06 crc kubenswrapper[4775]: I1216 15:18:06.905185 4775 generic.go:334] "Generic (PLEG): container finished" podID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerID="abd091ea75c441b1b32f3067d5db38cc99820ebfe68250bc5f3dfd39608c940d" exitCode=0 Dec 16 15:18:06 crc kubenswrapper[4775]: I1216 15:18:06.905240 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc6d50e-df14-4345-8522-7b3f41d8b956","Type":"ContainerDied","Data":"5a1c0c85db095fa1e0d5ee0c87112c68565475315a4fbddc94fb2925e3db2d9f"} Dec 16 15:18:06 crc kubenswrapper[4775]: I1216 15:18:06.905273 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc6d50e-df14-4345-8522-7b3f41d8b956","Type":"ContainerDied","Data":"9711b307d831e6e8c07ec5073c58b51ec2d250a104f0aaf4b6856e3d4eb68a04"} Dec 16 15:18:06 crc kubenswrapper[4775]: I1216 15:18:06.905288 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc6d50e-df14-4345-8522-7b3f41d8b956","Type":"ContainerDied","Data":"abd091ea75c441b1b32f3067d5db38cc99820ebfe68250bc5f3dfd39608c940d"} Dec 16 15:18:06 crc kubenswrapper[4775]: I1216 15:18:06.909613 4775 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="acb27686-174d-4795-8859-af77a1964fb5" containerName="nova-api-log" containerID="cri-o://8be6db6ccb0d882cb2cc26581e3cfaf9253da95ec5178fc48f9b138d3ad28ebb" gracePeriod=30 Dec 16 15:18:06 crc kubenswrapper[4775]: I1216 15:18:06.910687 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" event={"ID":"1b8b1fb4-1c9e-4920-aa5e-767437ef455f","Type":"ContainerStarted","Data":"fb5ee18e6c468bbafb0421247f690fe08a9100a09726d0f904a05a89279b3712"} Dec 16 15:18:06 crc kubenswrapper[4775]: I1216 15:18:06.910728 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:06 crc kubenswrapper[4775]: I1216 15:18:06.911113 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="acb27686-174d-4795-8859-af77a1964fb5" containerName="nova-api-api" containerID="cri-o://ddb2b9ef0d187c77d3a151851198409019ec187ee84dd8433587adfa1a980ece" gracePeriod=30 Dec 16 15:18:06 crc kubenswrapper[4775]: I1216 15:18:06.950731 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" podStartSLOduration=2.950714019 podStartE2EDuration="2.950714019s" podCreationTimestamp="2025-12-16 15:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:18:06.946878576 +0000 UTC m=+1411.897957529" watchObservedRunningTime="2025-12-16 15:18:06.950714019 +0000 UTC m=+1411.901792942" Dec 16 15:18:07 crc kubenswrapper[4775]: I1216 15:18:07.922406 4775 generic.go:334] "Generic (PLEG): container finished" podID="acb27686-174d-4795-8859-af77a1964fb5" containerID="8be6db6ccb0d882cb2cc26581e3cfaf9253da95ec5178fc48f9b138d3ad28ebb" exitCode=143 Dec 16 15:18:07 crc kubenswrapper[4775]: I1216 15:18:07.922500 
4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"acb27686-174d-4795-8859-af77a1964fb5","Type":"ContainerDied","Data":"8be6db6ccb0d882cb2cc26581e3cfaf9253da95ec5178fc48f9b138d3ad28ebb"} Dec 16 15:18:08 crc kubenswrapper[4775]: I1216 15:18:08.280214 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.011075 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.011697 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="06cbbbf9-ba64-4343-bde4-61db8b81e2d8" containerName="kube-state-metrics" containerID="cri-o://eeeb3bf4d0da24759531d7eab23f781a4c354731192bcc208b01ead87fd9cf8c" gracePeriod=30 Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.389586 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.585237 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dc6d50e-df14-4345-8522-7b3f41d8b956-run-httpd\") pod \"5dc6d50e-df14-4345-8522-7b3f41d8b956\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.585317 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-config-data\") pod \"5dc6d50e-df14-4345-8522-7b3f41d8b956\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.585361 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-scripts\") pod \"5dc6d50e-df14-4345-8522-7b3f41d8b956\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.585520 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-799dw\" (UniqueName: \"kubernetes.io/projected/5dc6d50e-df14-4345-8522-7b3f41d8b956-kube-api-access-799dw\") pod \"5dc6d50e-df14-4345-8522-7b3f41d8b956\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.585585 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-combined-ca-bundle\") pod \"5dc6d50e-df14-4345-8522-7b3f41d8b956\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.585616 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5dc6d50e-df14-4345-8522-7b3f41d8b956-log-httpd\") pod \"5dc6d50e-df14-4345-8522-7b3f41d8b956\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.585713 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-sg-core-conf-yaml\") pod \"5dc6d50e-df14-4345-8522-7b3f41d8b956\" (UID: \"5dc6d50e-df14-4345-8522-7b3f41d8b956\") " Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.587239 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dc6d50e-df14-4345-8522-7b3f41d8b956-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5dc6d50e-df14-4345-8522-7b3f41d8b956" (UID: "5dc6d50e-df14-4345-8522-7b3f41d8b956"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.587352 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dc6d50e-df14-4345-8522-7b3f41d8b956-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.587412 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dc6d50e-df14-4345-8522-7b3f41d8b956-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5dc6d50e-df14-4345-8522-7b3f41d8b956" (UID: "5dc6d50e-df14-4345-8522-7b3f41d8b956"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.592403 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dc6d50e-df14-4345-8522-7b3f41d8b956-kube-api-access-799dw" (OuterVolumeSpecName: "kube-api-access-799dw") pod "5dc6d50e-df14-4345-8522-7b3f41d8b956" (UID: "5dc6d50e-df14-4345-8522-7b3f41d8b956"). InnerVolumeSpecName "kube-api-access-799dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.592990 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-scripts" (OuterVolumeSpecName: "scripts") pod "5dc6d50e-df14-4345-8522-7b3f41d8b956" (UID: "5dc6d50e-df14-4345-8522-7b3f41d8b956"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.610773 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.618141 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.621171 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5dc6d50e-df14-4345-8522-7b3f41d8b956" (UID: "5dc6d50e-df14-4345-8522-7b3f41d8b956"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.696740 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acb27686-174d-4795-8859-af77a1964fb5-logs\") pod \"acb27686-174d-4795-8859-af77a1964fb5\" (UID: \"acb27686-174d-4795-8859-af77a1964fb5\") " Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.699290 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb27686-174d-4795-8859-af77a1964fb5-config-data\") pod \"acb27686-174d-4795-8859-af77a1964fb5\" (UID: \"acb27686-174d-4795-8859-af77a1964fb5\") " Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.699345 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb27686-174d-4795-8859-af77a1964fb5-combined-ca-bundle\") pod \"acb27686-174d-4795-8859-af77a1964fb5\" (UID: \"acb27686-174d-4795-8859-af77a1964fb5\") " Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.700270 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dc6d50e-df14-4345-8522-7b3f41d8b956-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.700299 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.700311 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-799dw\" (UniqueName: \"kubernetes.io/projected/5dc6d50e-df14-4345-8522-7b3f41d8b956-kube-api-access-799dw\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.700325 4775 reconciler_common.go:293] "Volume detached for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.697306 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acb27686-174d-4795-8859-af77a1964fb5-logs" (OuterVolumeSpecName: "logs") pod "acb27686-174d-4795-8859-af77a1964fb5" (UID: "acb27686-174d-4795-8859-af77a1964fb5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.719011 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dc6d50e-df14-4345-8522-7b3f41d8b956" (UID: "5dc6d50e-df14-4345-8522-7b3f41d8b956"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.729038 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb27686-174d-4795-8859-af77a1964fb5-config-data" (OuterVolumeSpecName: "config-data") pod "acb27686-174d-4795-8859-af77a1964fb5" (UID: "acb27686-174d-4795-8859-af77a1964fb5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.742614 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-config-data" (OuterVolumeSpecName: "config-data") pod "5dc6d50e-df14-4345-8522-7b3f41d8b956" (UID: "5dc6d50e-df14-4345-8522-7b3f41d8b956"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.743208 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb27686-174d-4795-8859-af77a1964fb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acb27686-174d-4795-8859-af77a1964fb5" (UID: "acb27686-174d-4795-8859-af77a1964fb5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.801789 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2swks\" (UniqueName: \"kubernetes.io/projected/acb27686-174d-4795-8859-af77a1964fb5-kube-api-access-2swks\") pod \"acb27686-174d-4795-8859-af77a1964fb5\" (UID: \"acb27686-174d-4795-8859-af77a1964fb5\") " Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.801907 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmdft\" (UniqueName: \"kubernetes.io/projected/06cbbbf9-ba64-4343-bde4-61db8b81e2d8-kube-api-access-jmdft\") pod \"06cbbbf9-ba64-4343-bde4-61db8b81e2d8\" (UID: \"06cbbbf9-ba64-4343-bde4-61db8b81e2d8\") " Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.802483 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.802500 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb27686-174d-4795-8859-af77a1964fb5-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.802513 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb27686-174d-4795-8859-af77a1964fb5-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.802523 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc6d50e-df14-4345-8522-7b3f41d8b956-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.802534 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acb27686-174d-4795-8859-af77a1964fb5-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.805302 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb27686-174d-4795-8859-af77a1964fb5-kube-api-access-2swks" (OuterVolumeSpecName: "kube-api-access-2swks") pod "acb27686-174d-4795-8859-af77a1964fb5" (UID: "acb27686-174d-4795-8859-af77a1964fb5"). InnerVolumeSpecName "kube-api-access-2swks". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.806213 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06cbbbf9-ba64-4343-bde4-61db8b81e2d8-kube-api-access-jmdft" (OuterVolumeSpecName: "kube-api-access-jmdft") pod "06cbbbf9-ba64-4343-bde4-61db8b81e2d8" (UID: "06cbbbf9-ba64-4343-bde4-61db8b81e2d8"). InnerVolumeSpecName "kube-api-access-jmdft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.903432 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2swks\" (UniqueName: \"kubernetes.io/projected/acb27686-174d-4795-8859-af77a1964fb5-kube-api-access-2swks\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.903472 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmdft\" (UniqueName: \"kubernetes.io/projected/06cbbbf9-ba64-4343-bde4-61db8b81e2d8-kube-api-access-jmdft\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.950244 4775 generic.go:334] "Generic (PLEG): container finished" podID="acb27686-174d-4795-8859-af77a1964fb5" containerID="ddb2b9ef0d187c77d3a151851198409019ec187ee84dd8433587adfa1a980ece" exitCode=0 Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.950306 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"acb27686-174d-4795-8859-af77a1964fb5","Type":"ContainerDied","Data":"ddb2b9ef0d187c77d3a151851198409019ec187ee84dd8433587adfa1a980ece"} Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.950336 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"acb27686-174d-4795-8859-af77a1964fb5","Type":"ContainerDied","Data":"f21579d7a2ca97798b8b2a071531470fb27dc1db52d750485c54f5af31a0b601"} Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.950354 4775 scope.go:117] "RemoveContainer" containerID="ddb2b9ef0d187c77d3a151851198409019ec187ee84dd8433587adfa1a980ece" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.950492 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.961762 4775 generic.go:334] "Generic (PLEG): container finished" podID="06cbbbf9-ba64-4343-bde4-61db8b81e2d8" containerID="eeeb3bf4d0da24759531d7eab23f781a4c354731192bcc208b01ead87fd9cf8c" exitCode=2 Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.962032 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"06cbbbf9-ba64-4343-bde4-61db8b81e2d8","Type":"ContainerDied","Data":"eeeb3bf4d0da24759531d7eab23f781a4c354731192bcc208b01ead87fd9cf8c"} Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.962129 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"06cbbbf9-ba64-4343-bde4-61db8b81e2d8","Type":"ContainerDied","Data":"ff0fc197cca6d40eb8b1770e4810ff88538fd34242a251cca4d42fb30b468318"} Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.962230 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.970218 4775 generic.go:334] "Generic (PLEG): container finished" podID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerID="3b0eeed3eb498d88f58a831fd428e32689aed36255274f4595928f2573627371" exitCode=0 Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.970273 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc6d50e-df14-4345-8522-7b3f41d8b956","Type":"ContainerDied","Data":"3b0eeed3eb498d88f58a831fd428e32689aed36255274f4595928f2573627371"} Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.970309 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc6d50e-df14-4345-8522-7b3f41d8b956","Type":"ContainerDied","Data":"7b179467391399b48434c3fbdda1e90c1a0a0e4b63d2315abe9619600442e05c"} Dec 16 15:18:10 crc kubenswrapper[4775]: I1216 15:18:10.970337 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.011750 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.013670 4775 scope.go:117] "RemoveContainer" containerID="8be6db6ccb0d882cb2cc26581e3cfaf9253da95ec5178fc48f9b138d3ad28ebb" Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.045625 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.063723 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.071164 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 15:18:11 crc kubenswrapper[4775]: E1216 15:18:11.071642 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb27686-174d-4795-8859-af77a1964fb5" containerName="nova-api-api" Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.071664 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb27686-174d-4795-8859-af77a1964fb5" containerName="nova-api-api" Dec 16 15:18:11 crc kubenswrapper[4775]: E1216 15:18:11.071691 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerName="ceilometer-notification-agent" Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.071700 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerName="ceilometer-notification-agent" Dec 16 15:18:11 crc kubenswrapper[4775]: E1216 15:18:11.071720 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerName="proxy-httpd" Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.071733 4775 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerName="proxy-httpd" Dec 16 15:18:11 crc kubenswrapper[4775]: E1216 15:18:11.071749 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerName="ceilometer-central-agent" Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.071756 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerName="ceilometer-central-agent" Dec 16 15:18:11 crc kubenswrapper[4775]: E1216 15:18:11.071770 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerName="sg-core" Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.071776 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerName="sg-core" Dec 16 15:18:11 crc kubenswrapper[4775]: E1216 15:18:11.071792 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb27686-174d-4795-8859-af77a1964fb5" containerName="nova-api-log" Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.071798 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb27686-174d-4795-8859-af77a1964fb5" containerName="nova-api-log" Dec 16 15:18:11 crc kubenswrapper[4775]: E1216 15:18:11.071808 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06cbbbf9-ba64-4343-bde4-61db8b81e2d8" containerName="kube-state-metrics" Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.071815 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="06cbbbf9-ba64-4343-bde4-61db8b81e2d8" containerName="kube-state-metrics" Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.072045 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="06cbbbf9-ba64-4343-bde4-61db8b81e2d8" containerName="kube-state-metrics" Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.072059 4775 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerName="ceilometer-notification-agent"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.072071 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb27686-174d-4795-8859-af77a1964fb5" containerName="nova-api-api"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.072087 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerName="ceilometer-central-agent"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.072103 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerName="sg-core"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.072117 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb27686-174d-4795-8859-af77a1964fb5" containerName="nova-api-log"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.072125 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc6d50e-df14-4345-8522-7b3f41d8b956" containerName="proxy-httpd"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.073255 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.082678 4775 scope.go:117] "RemoveContainer" containerID="ddb2b9ef0d187c77d3a151851198409019ec187ee84dd8433587adfa1a980ece"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.082747 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.085263 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.085761 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Dec 16 15:18:11 crc kubenswrapper[4775]: E1216 15:18:11.086830 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddb2b9ef0d187c77d3a151851198409019ec187ee84dd8433587adfa1a980ece\": container with ID starting with ddb2b9ef0d187c77d3a151851198409019ec187ee84dd8433587adfa1a980ece not found: ID does not exist" containerID="ddb2b9ef0d187c77d3a151851198409019ec187ee84dd8433587adfa1a980ece"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.087061 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddb2b9ef0d187c77d3a151851198409019ec187ee84dd8433587adfa1a980ece"} err="failed to get container status \"ddb2b9ef0d187c77d3a151851198409019ec187ee84dd8433587adfa1a980ece\": rpc error: code = NotFound desc = could not find container \"ddb2b9ef0d187c77d3a151851198409019ec187ee84dd8433587adfa1a980ece\": container with ID starting with ddb2b9ef0d187c77d3a151851198409019ec187ee84dd8433587adfa1a980ece not found: ID does not exist"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.087083 4775 scope.go:117] "RemoveContainer" containerID="8be6db6ccb0d882cb2cc26581e3cfaf9253da95ec5178fc48f9b138d3ad28ebb"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.087226 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 16 15:18:11 crc kubenswrapper[4775]: E1216 15:18:11.090380 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be6db6ccb0d882cb2cc26581e3cfaf9253da95ec5178fc48f9b138d3ad28ebb\": container with ID starting with 8be6db6ccb0d882cb2cc26581e3cfaf9253da95ec5178fc48f9b138d3ad28ebb not found: ID does not exist" containerID="8be6db6ccb0d882cb2cc26581e3cfaf9253da95ec5178fc48f9b138d3ad28ebb"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.090676 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be6db6ccb0d882cb2cc26581e3cfaf9253da95ec5178fc48f9b138d3ad28ebb"} err="failed to get container status \"8be6db6ccb0d882cb2cc26581e3cfaf9253da95ec5178fc48f9b138d3ad28ebb\": rpc error: code = NotFound desc = could not find container \"8be6db6ccb0d882cb2cc26581e3cfaf9253da95ec5178fc48f9b138d3ad28ebb\": container with ID starting with 8be6db6ccb0d882cb2cc26581e3cfaf9253da95ec5178fc48f9b138d3ad28ebb not found: ID does not exist"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.090719 4775 scope.go:117] "RemoveContainer" containerID="eeeb3bf4d0da24759531d7eab23f781a4c354731192bcc208b01ead87fd9cf8c"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.090551 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.099870 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.134693 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.135951 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.138624 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.138635 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-846p5"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.140447 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.149391 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.149703 4775 scope.go:117] "RemoveContainer" containerID="eeeb3bf4d0da24759531d7eab23f781a4c354731192bcc208b01ead87fd9cf8c"
Dec 16 15:18:11 crc kubenswrapper[4775]: E1216 15:18:11.153327 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeeb3bf4d0da24759531d7eab23f781a4c354731192bcc208b01ead87fd9cf8c\": container with ID starting with eeeb3bf4d0da24759531d7eab23f781a4c354731192bcc208b01ead87fd9cf8c not found: ID does not exist" containerID="eeeb3bf4d0da24759531d7eab23f781a4c354731192bcc208b01ead87fd9cf8c"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.153366 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeeb3bf4d0da24759531d7eab23f781a4c354731192bcc208b01ead87fd9cf8c"} err="failed to get container status \"eeeb3bf4d0da24759531d7eab23f781a4c354731192bcc208b01ead87fd9cf8c\": rpc error: code = NotFound desc = could not find container \"eeeb3bf4d0da24759531d7eab23f781a4c354731192bcc208b01ead87fd9cf8c\": container with ID starting with eeeb3bf4d0da24759531d7eab23f781a4c354731192bcc208b01ead87fd9cf8c not found: ID does not exist"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.153402 4775 scope.go:117] "RemoveContainer" containerID="5a1c0c85db095fa1e0d5ee0c87112c68565475315a4fbddc94fb2925e3db2d9f"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.164868 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.167342 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.169846 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.170054 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.173832 4775 scope.go:117] "RemoveContainer" containerID="9711b307d831e6e8c07ec5073c58b51ec2d250a104f0aaf4b6856e3d4eb68a04"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.174994 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.189448 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.199128 4775 scope.go:117] "RemoveContainer" containerID="3b0eeed3eb498d88f58a831fd428e32689aed36255274f4595928f2573627371"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.210074 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kllkl\" (UniqueName: \"kubernetes.io/projected/350c79c3-b66c-4384-99db-437cba78dcd3-kube-api-access-kllkl\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.210136 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-public-tls-certs\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.210231 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-config-data\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.210283 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/350c79c3-b66c-4384-99db-437cba78dcd3-logs\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.210326 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.210379 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.223940 4775 scope.go:117] "RemoveContainer" containerID="abd091ea75c441b1b32f3067d5db38cc99820ebfe68250bc5f3dfd39608c940d"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.248099 4775 scope.go:117] "RemoveContainer" containerID="5a1c0c85db095fa1e0d5ee0c87112c68565475315a4fbddc94fb2925e3db2d9f"
Dec 16 15:18:11 crc kubenswrapper[4775]: E1216 15:18:11.252250 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a1c0c85db095fa1e0d5ee0c87112c68565475315a4fbddc94fb2925e3db2d9f\": container with ID starting with 5a1c0c85db095fa1e0d5ee0c87112c68565475315a4fbddc94fb2925e3db2d9f not found: ID does not exist" containerID="5a1c0c85db095fa1e0d5ee0c87112c68565475315a4fbddc94fb2925e3db2d9f"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.252284 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a1c0c85db095fa1e0d5ee0c87112c68565475315a4fbddc94fb2925e3db2d9f"} err="failed to get container status \"5a1c0c85db095fa1e0d5ee0c87112c68565475315a4fbddc94fb2925e3db2d9f\": rpc error: code = NotFound desc = could not find container \"5a1c0c85db095fa1e0d5ee0c87112c68565475315a4fbddc94fb2925e3db2d9f\": container with ID starting with 5a1c0c85db095fa1e0d5ee0c87112c68565475315a4fbddc94fb2925e3db2d9f not found: ID does not exist"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.252310 4775 scope.go:117] "RemoveContainer" containerID="9711b307d831e6e8c07ec5073c58b51ec2d250a104f0aaf4b6856e3d4eb68a04"
Dec 16 15:18:11 crc kubenswrapper[4775]: E1216 15:18:11.252666 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9711b307d831e6e8c07ec5073c58b51ec2d250a104f0aaf4b6856e3d4eb68a04\": container with ID starting with 9711b307d831e6e8c07ec5073c58b51ec2d250a104f0aaf4b6856e3d4eb68a04 not found: ID does not exist" containerID="9711b307d831e6e8c07ec5073c58b51ec2d250a104f0aaf4b6856e3d4eb68a04"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.252721 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9711b307d831e6e8c07ec5073c58b51ec2d250a104f0aaf4b6856e3d4eb68a04"} err="failed to get container status \"9711b307d831e6e8c07ec5073c58b51ec2d250a104f0aaf4b6856e3d4eb68a04\": rpc error: code = NotFound desc = could not find container \"9711b307d831e6e8c07ec5073c58b51ec2d250a104f0aaf4b6856e3d4eb68a04\": container with ID starting with 9711b307d831e6e8c07ec5073c58b51ec2d250a104f0aaf4b6856e3d4eb68a04 not found: ID does not exist"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.252759 4775 scope.go:117] "RemoveContainer" containerID="3b0eeed3eb498d88f58a831fd428e32689aed36255274f4595928f2573627371"
Dec 16 15:18:11 crc kubenswrapper[4775]: E1216 15:18:11.255164 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b0eeed3eb498d88f58a831fd428e32689aed36255274f4595928f2573627371\": container with ID starting with 3b0eeed3eb498d88f58a831fd428e32689aed36255274f4595928f2573627371 not found: ID does not exist" containerID="3b0eeed3eb498d88f58a831fd428e32689aed36255274f4595928f2573627371"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.255211 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b0eeed3eb498d88f58a831fd428e32689aed36255274f4595928f2573627371"} err="failed to get container status \"3b0eeed3eb498d88f58a831fd428e32689aed36255274f4595928f2573627371\": rpc error: code = NotFound desc = could not find container \"3b0eeed3eb498d88f58a831fd428e32689aed36255274f4595928f2573627371\": container with ID starting with 3b0eeed3eb498d88f58a831fd428e32689aed36255274f4595928f2573627371 not found: ID does not exist"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.255249 4775 scope.go:117] "RemoveContainer" containerID="abd091ea75c441b1b32f3067d5db38cc99820ebfe68250bc5f3dfd39608c940d"
Dec 16 15:18:11 crc kubenswrapper[4775]: E1216 15:18:11.255518 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abd091ea75c441b1b32f3067d5db38cc99820ebfe68250bc5f3dfd39608c940d\": container with ID starting with abd091ea75c441b1b32f3067d5db38cc99820ebfe68250bc5f3dfd39608c940d not found: ID does not exist" containerID="abd091ea75c441b1b32f3067d5db38cc99820ebfe68250bc5f3dfd39608c940d"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.255544 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abd091ea75c441b1b32f3067d5db38cc99820ebfe68250bc5f3dfd39608c940d"} err="failed to get container status \"abd091ea75c441b1b32f3067d5db38cc99820ebfe68250bc5f3dfd39608c940d\": rpc error: code = NotFound desc = could not find container \"abd091ea75c441b1b32f3067d5db38cc99820ebfe68250bc5f3dfd39608c940d\": container with ID starting with abd091ea75c441b1b32f3067d5db38cc99820ebfe68250bc5f3dfd39608c940d not found: ID does not exist"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.311906 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.311973 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-config-data\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.312063 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r89xn\" (UniqueName: \"kubernetes.io/projected/dea2122b-dae9-4775-ab5c-b10760260231-kube-api-access-r89xn\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.312679 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.312841 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.312947 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.312984 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kllkl\" (UniqueName: \"kubernetes.io/projected/350c79c3-b66c-4384-99db-437cba78dcd3-kube-api-access-kllkl\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.313017 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-public-tls-certs\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.313046 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59l6t\" (UniqueName: \"kubernetes.io/projected/44120d84-ab08-40cb-ad82-59518b6f55b2-kube-api-access-59l6t\") pod \"kube-state-metrics-0\" (UID: \"44120d84-ab08-40cb-ad82-59518b6f55b2\") " pod="openstack/kube-state-metrics-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.313085 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/44120d84-ab08-40cb-ad82-59518b6f55b2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"44120d84-ab08-40cb-ad82-59518b6f55b2\") " pod="openstack/kube-state-metrics-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.313104 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dea2122b-dae9-4775-ab5c-b10760260231-log-httpd\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.313143 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dea2122b-dae9-4775-ab5c-b10760260231-run-httpd\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.313185 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/44120d84-ab08-40cb-ad82-59518b6f55b2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"44120d84-ab08-40cb-ad82-59518b6f55b2\") " pod="openstack/kube-state-metrics-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.313212 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-config-data\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.313236 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44120d84-ab08-40cb-ad82-59518b6f55b2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"44120d84-ab08-40cb-ad82-59518b6f55b2\") " pod="openstack/kube-state-metrics-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.313261 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/350c79c3-b66c-4384-99db-437cba78dcd3-logs\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.313279 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-scripts\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.313709 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/350c79c3-b66c-4384-99db-437cba78dcd3-logs\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.316799 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.317368 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.318640 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-public-tls-certs\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.319122 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-config-data\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.328914 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kllkl\" (UniqueName: \"kubernetes.io/projected/350c79c3-b66c-4384-99db-437cba78dcd3-kube-api-access-kllkl\") pod \"nova-api-0\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.366285 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06cbbbf9-ba64-4343-bde4-61db8b81e2d8" path="/var/lib/kubelet/pods/06cbbbf9-ba64-4343-bde4-61db8b81e2d8/volumes"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.367289 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dc6d50e-df14-4345-8522-7b3f41d8b956" path="/var/lib/kubelet/pods/5dc6d50e-df14-4345-8522-7b3f41d8b956/volumes"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.368275 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acb27686-174d-4795-8859-af77a1964fb5" path="/var/lib/kubelet/pods/acb27686-174d-4795-8859-af77a1964fb5/volumes"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.410813 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.415440 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/44120d84-ab08-40cb-ad82-59518b6f55b2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"44120d84-ab08-40cb-ad82-59518b6f55b2\") " pod="openstack/kube-state-metrics-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.415501 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dea2122b-dae9-4775-ab5c-b10760260231-log-httpd\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.415544 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dea2122b-dae9-4775-ab5c-b10760260231-run-httpd\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.415636 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/44120d84-ab08-40cb-ad82-59518b6f55b2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"44120d84-ab08-40cb-ad82-59518b6f55b2\") " pod="openstack/kube-state-metrics-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.415698 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44120d84-ab08-40cb-ad82-59518b6f55b2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"44120d84-ab08-40cb-ad82-59518b6f55b2\") " pod="openstack/kube-state-metrics-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.415741 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-scripts\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.415801 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-config-data\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.415906 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r89xn\" (UniqueName: \"kubernetes.io/projected/dea2122b-dae9-4775-ab5c-b10760260231-kube-api-access-r89xn\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.415979 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.416059 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.416136 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59l6t\" (UniqueName: \"kubernetes.io/projected/44120d84-ab08-40cb-ad82-59518b6f55b2-kube-api-access-59l6t\") pod \"kube-state-metrics-0\" (UID: \"44120d84-ab08-40cb-ad82-59518b6f55b2\") " pod="openstack/kube-state-metrics-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.417004 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dea2122b-dae9-4775-ab5c-b10760260231-run-httpd\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.417251 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dea2122b-dae9-4775-ab5c-b10760260231-log-httpd\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.421565 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.423375 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.424707 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-scripts\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.431533 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/44120d84-ab08-40cb-ad82-59518b6f55b2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"44120d84-ab08-40cb-ad82-59518b6f55b2\") " pod="openstack/kube-state-metrics-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.433588 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44120d84-ab08-40cb-ad82-59518b6f55b2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"44120d84-ab08-40cb-ad82-59518b6f55b2\") " pod="openstack/kube-state-metrics-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.433648 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/44120d84-ab08-40cb-ad82-59518b6f55b2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"44120d84-ab08-40cb-ad82-59518b6f55b2\") " pod="openstack/kube-state-metrics-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.439270 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59l6t\" (UniqueName: \"kubernetes.io/projected/44120d84-ab08-40cb-ad82-59518b6f55b2-kube-api-access-59l6t\") pod \"kube-state-metrics-0\" (UID: \"44120d84-ab08-40cb-ad82-59518b6f55b2\") " pod="openstack/kube-state-metrics-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.441373 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r89xn\" (UniqueName: \"kubernetes.io/projected/dea2122b-dae9-4775-ab5c-b10760260231-kube-api-access-r89xn\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.441434 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-config-data\") pod \"ceilometer-0\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.457870 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.483614 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.887232 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.984758 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"350c79c3-b66c-4384-99db-437cba78dcd3","Type":"ContainerStarted","Data":"6a0ca071fc42570ff50802a261b1a2a520de8a25ecc8345904a44a534f2603b6"}
Dec 16 15:18:11 crc kubenswrapper[4775]: I1216 15:18:11.999763 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 16 15:18:12 crc kubenswrapper[4775]: W1216 15:18:12.005452 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44120d84_ab08_40cb_ad82_59518b6f55b2.slice/crio-e6410b8854f0ad2243937173a5881c727fc7c9e8d50c54257284bf308e5cfaf1 WatchSource:0}: Error finding container e6410b8854f0ad2243937173a5881c727fc7c9e8d50c54257284bf308e5cfaf1: Status 404 returned error can't find the container with id e6410b8854f0ad2243937173a5881c727fc7c9e8d50c54257284bf308e5cfaf1
Dec 16 15:18:12 crc kubenswrapper[4775]: I1216 15:18:12.098729 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 15:18:12 crc kubenswrapper[4775]: W1216 15:18:12.106265 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddea2122b_dae9_4775_ab5c_b10760260231.slice/crio-0ca7f5e1cfb5e064e6c212355969456bfa6a79b6e375250b7d803681e0f8cda9 WatchSource:0}: Error finding container 0ca7f5e1cfb5e064e6c212355969456bfa6a79b6e375250b7d803681e0f8cda9: Status 404 returned error can't find the container with id 0ca7f5e1cfb5e064e6c212355969456bfa6a79b6e375250b7d803681e0f8cda9
Dec 16 15:18:12 crc kubenswrapper[4775]: I1216 15:18:12.174200 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 15:18:13 crc kubenswrapper[4775]: I1216 15:18:13.004615 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"44120d84-ab08-40cb-ad82-59518b6f55b2","Type":"ContainerStarted","Data":"3e3d102c59bca82bf066f0f453fdc1b37d9f3d717302cd2d66e130103bb69031"}
Dec 16 15:18:13 crc kubenswrapper[4775]: I1216 15:18:13.005021 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"44120d84-ab08-40cb-ad82-59518b6f55b2","Type":"ContainerStarted","Data":"e6410b8854f0ad2243937173a5881c727fc7c9e8d50c54257284bf308e5cfaf1"}
Dec 16 15:18:13 crc kubenswrapper[4775]: I1216 15:18:13.005217 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Dec 16 15:18:13 crc kubenswrapper[4775]: I1216 15:18:13.006925 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dea2122b-dae9-4775-ab5c-b10760260231","Type":"ContainerStarted","Data":"0ca7f5e1cfb5e064e6c212355969456bfa6a79b6e375250b7d803681e0f8cda9"}
Dec 16 15:18:13 crc kubenswrapper[4775]: I1216 15:18:13.008746 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"350c79c3-b66c-4384-99db-437cba78dcd3","Type":"ContainerStarted","Data":"777049e4802dd7b16cc97be573df5ff26fe3c5f656db253adfa39b1303276035"}
Dec 16 15:18:13 crc kubenswrapper[4775]: I1216 15:18:13.008779 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"350c79c3-b66c-4384-99db-437cba78dcd3","Type":"ContainerStarted","Data":"fe1e01275823875199399a3ac389cd4adf20983d2d62db98a73751595bce26f4"}
Dec 16 15:18:13 crc kubenswrapper[4775]: I1216 15:18:13.033260 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.655778872 podStartE2EDuration="2.033239122s" podCreationTimestamp="2025-12-16 15:18:11 +0000 UTC" firstStartedPulling="2025-12-16 15:18:12.014003687 +0000 UTC m=+1416.965082610" lastFinishedPulling="2025-12-16 15:18:12.391463937 +0000 UTC m=+1417.342542860" observedRunningTime="2025-12-16 15:18:13.02261319 +0000 UTC m=+1417.973692123" watchObservedRunningTime="2025-12-16 15:18:13.033239122 +0000 UTC m=+1417.984318045"
Dec 16 15:18:13 crc kubenswrapper[4775]: I1216 15:18:13.059239 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.059216609 podStartE2EDuration="3.059216609s" podCreationTimestamp="2025-12-16 15:18:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:18:13.048140282 +0000 UTC m=+1417.999219215" watchObservedRunningTime="2025-12-16 15:18:13.059216609 +0000 UTC m=+1418.010295532"
Dec 16 15:18:13 crc kubenswrapper[4775]: I1216 15:18:13.280986 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Dec 16 15:18:13 crc kubenswrapper[4775]: I1216 15:18:13.298738 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.025485 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dea2122b-dae9-4775-ab5c-b10760260231","Type":"ContainerStarted","Data":"1298c7bbb8e1491cd68cf1ce0fc4d17637c2cbfbb35bdf63a1502dff091919e7"}
Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.025940 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dea2122b-dae9-4775-ab5c-b10760260231","Type":"ContainerStarted","Data":"2b40cd0fc94548764448da9eda628eb441eab64010fde82204c1fd8e4137cba0"}
Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.046426 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.255549 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-bdkkf"]
Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.257477 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bdkkf" Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.260534 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.265193 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.270042 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-bdkkf"] Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.377042 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bdkkf\" (UID: \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\") " pod="openstack/nova-cell1-cell-mapping-bdkkf" Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.377133 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-config-data\") pod \"nova-cell1-cell-mapping-bdkkf\" (UID: \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\") " pod="openstack/nova-cell1-cell-mapping-bdkkf" Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.377172 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4vh8\" (UniqueName: \"kubernetes.io/projected/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-kube-api-access-r4vh8\") pod \"nova-cell1-cell-mapping-bdkkf\" (UID: \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\") " pod="openstack/nova-cell1-cell-mapping-bdkkf" Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.377800 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-scripts\") pod \"nova-cell1-cell-mapping-bdkkf\" (UID: \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\") " pod="openstack/nova-cell1-cell-mapping-bdkkf" Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.479382 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-config-data\") pod \"nova-cell1-cell-mapping-bdkkf\" (UID: \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\") " pod="openstack/nova-cell1-cell-mapping-bdkkf" Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.479429 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4vh8\" (UniqueName: \"kubernetes.io/projected/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-kube-api-access-r4vh8\") pod \"nova-cell1-cell-mapping-bdkkf\" (UID: \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\") " pod="openstack/nova-cell1-cell-mapping-bdkkf" Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.479572 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-scripts\") pod \"nova-cell1-cell-mapping-bdkkf\" (UID: \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\") " pod="openstack/nova-cell1-cell-mapping-bdkkf" Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.479596 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bdkkf\" (UID: \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\") " pod="openstack/nova-cell1-cell-mapping-bdkkf" Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.487849 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-config-data\") pod \"nova-cell1-cell-mapping-bdkkf\" (UID: \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\") " pod="openstack/nova-cell1-cell-mapping-bdkkf" Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.488635 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bdkkf\" (UID: \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\") " pod="openstack/nova-cell1-cell-mapping-bdkkf" Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.498515 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-scripts\") pod \"nova-cell1-cell-mapping-bdkkf\" (UID: \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\") " pod="openstack/nova-cell1-cell-mapping-bdkkf" Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.505509 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4vh8\" (UniqueName: \"kubernetes.io/projected/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-kube-api-access-r4vh8\") pod \"nova-cell1-cell-mapping-bdkkf\" (UID: \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\") " pod="openstack/nova-cell1-cell-mapping-bdkkf" Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.517021 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.582232 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bdkkf" Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.619001 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-6tmts"] Dec 16 15:18:14 crc kubenswrapper[4775]: I1216 15:18:14.619504 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b86998b5-6tmts" podUID="f9a78807-ba42-4640-94be-a98bc08000a6" containerName="dnsmasq-dns" containerID="cri-o://a1274319190212b94cc34588b38f69ef4340276bcaadf212aefc0a4f4ab6e4ab" gracePeriod=10 Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.047257 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dea2122b-dae9-4775-ab5c-b10760260231","Type":"ContainerStarted","Data":"f291d494b8e944fa236a5dc1bf55b8664e50491a176b87cbb20267e1ec08da13"} Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.049785 4775 generic.go:334] "Generic (PLEG): container finished" podID="f9a78807-ba42-4640-94be-a98bc08000a6" containerID="a1274319190212b94cc34588b38f69ef4340276bcaadf212aefc0a4f4ab6e4ab" exitCode=0 Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.050052 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-6tmts" event={"ID":"f9a78807-ba42-4640-94be-a98bc08000a6","Type":"ContainerDied","Data":"a1274319190212b94cc34588b38f69ef4340276bcaadf212aefc0a4f4ab6e4ab"} Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.113570 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-bdkkf"] Dec 16 15:18:15 crc kubenswrapper[4775]: W1216 15:18:15.122781 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf16cf76e_4507_4b0a_aefd_c32b2b0763f1.slice/crio-b55dee4500c16b4b430f13822e947214dd17b88b9cbf6063ef647218ec1b0d4d WatchSource:0}: Error finding container 
b55dee4500c16b4b430f13822e947214dd17b88b9cbf6063ef647218ec1b0d4d: Status 404 returned error can't find the container with id b55dee4500c16b4b430f13822e947214dd17b88b9cbf6063ef647218ec1b0d4d Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.195399 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.299910 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-config\") pod \"f9a78807-ba42-4640-94be-a98bc08000a6\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.300348 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-ovsdbserver-nb\") pod \"f9a78807-ba42-4640-94be-a98bc08000a6\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.300515 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-ovsdbserver-sb\") pod \"f9a78807-ba42-4640-94be-a98bc08000a6\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.300548 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mf89\" (UniqueName: \"kubernetes.io/projected/f9a78807-ba42-4640-94be-a98bc08000a6-kube-api-access-7mf89\") pod \"f9a78807-ba42-4640-94be-a98bc08000a6\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.300639 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-dns-svc\") pod \"f9a78807-ba42-4640-94be-a98bc08000a6\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.300699 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-dns-swift-storage-0\") pod \"f9a78807-ba42-4640-94be-a98bc08000a6\" (UID: \"f9a78807-ba42-4640-94be-a98bc08000a6\") " Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.309099 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9a78807-ba42-4640-94be-a98bc08000a6-kube-api-access-7mf89" (OuterVolumeSpecName: "kube-api-access-7mf89") pod "f9a78807-ba42-4640-94be-a98bc08000a6" (UID: "f9a78807-ba42-4640-94be-a98bc08000a6"). InnerVolumeSpecName "kube-api-access-7mf89". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.363792 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-config" (OuterVolumeSpecName: "config") pod "f9a78807-ba42-4640-94be-a98bc08000a6" (UID: "f9a78807-ba42-4640-94be-a98bc08000a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.366232 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9a78807-ba42-4640-94be-a98bc08000a6" (UID: "f9a78807-ba42-4640-94be-a98bc08000a6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.390918 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f9a78807-ba42-4640-94be-a98bc08000a6" (UID: "f9a78807-ba42-4640-94be-a98bc08000a6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.393418 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9a78807-ba42-4640-94be-a98bc08000a6" (UID: "f9a78807-ba42-4640-94be-a98bc08000a6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.403254 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.403287 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mf89\" (UniqueName: \"kubernetes.io/projected/f9a78807-ba42-4640-94be-a98bc08000a6-kube-api-access-7mf89\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.403298 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.403310 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.403319 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.419930 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9a78807-ba42-4640-94be-a98bc08000a6" (UID: "f9a78807-ba42-4640-94be-a98bc08000a6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:15 crc kubenswrapper[4775]: I1216 15:18:15.505818 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9a78807-ba42-4640-94be-a98bc08000a6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:16 crc kubenswrapper[4775]: I1216 15:18:16.059130 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-6tmts" event={"ID":"f9a78807-ba42-4640-94be-a98bc08000a6","Type":"ContainerDied","Data":"9ac916dcde6794e69e9329758abfcb551a1adad3c1fdcfac7d28a8b8524ea861"} Dec 16 15:18:16 crc kubenswrapper[4775]: I1216 15:18:16.059220 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-6tmts" Dec 16 15:18:16 crc kubenswrapper[4775]: I1216 15:18:16.059435 4775 scope.go:117] "RemoveContainer" containerID="a1274319190212b94cc34588b38f69ef4340276bcaadf212aefc0a4f4ab6e4ab" Dec 16 15:18:16 crc kubenswrapper[4775]: I1216 15:18:16.060627 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bdkkf" event={"ID":"f16cf76e-4507-4b0a-aefd-c32b2b0763f1","Type":"ContainerStarted","Data":"63525ee91b85db663a55315ccf39419df3bc609346f6df513788091ea6679d1d"} Dec 16 15:18:16 crc kubenswrapper[4775]: I1216 15:18:16.060652 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bdkkf" event={"ID":"f16cf76e-4507-4b0a-aefd-c32b2b0763f1","Type":"ContainerStarted","Data":"b55dee4500c16b4b430f13822e947214dd17b88b9cbf6063ef647218ec1b0d4d"} Dec 16 15:18:16 crc kubenswrapper[4775]: I1216 15:18:16.084373 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-bdkkf" podStartSLOduration=2.084351967 podStartE2EDuration="2.084351967s" podCreationTimestamp="2025-12-16 15:18:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:18:16.077257338 +0000 UTC m=+1421.028336271" watchObservedRunningTime="2025-12-16 15:18:16.084351967 +0000 UTC m=+1421.035430890" Dec 16 15:18:16 crc kubenswrapper[4775]: I1216 15:18:16.327585 4775 scope.go:117] "RemoveContainer" containerID="fe2c60dc29a4084cf5e04cc3e2c95243300464cd744c30848b14cf94af7eca09" Dec 16 15:18:16 crc kubenswrapper[4775]: I1216 15:18:16.343574 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-6tmts"] Dec 16 15:18:16 crc kubenswrapper[4775]: I1216 15:18:16.356046 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-6tmts"] Dec 16 15:18:17 crc kubenswrapper[4775]: 
I1216 15:18:17.072182 4775 generic.go:334] "Generic (PLEG): container finished" podID="dea2122b-dae9-4775-ab5c-b10760260231" containerID="b425de9c356e711920f83512dd9fb63d59114844d5aa232e8c5e42b89fb6b63b" exitCode=1 Dec 16 15:18:17 crc kubenswrapper[4775]: I1216 15:18:17.072281 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dea2122b-dae9-4775-ab5c-b10760260231","Type":"ContainerDied","Data":"b425de9c356e711920f83512dd9fb63d59114844d5aa232e8c5e42b89fb6b63b"} Dec 16 15:18:17 crc kubenswrapper[4775]: I1216 15:18:17.072319 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dea2122b-dae9-4775-ab5c-b10760260231" containerName="ceilometer-central-agent" containerID="cri-o://2b40cd0fc94548764448da9eda628eb441eab64010fde82204c1fd8e4137cba0" gracePeriod=30 Dec 16 15:18:17 crc kubenswrapper[4775]: I1216 15:18:17.072348 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dea2122b-dae9-4775-ab5c-b10760260231" containerName="sg-core" containerID="cri-o://f291d494b8e944fa236a5dc1bf55b8664e50491a176b87cbb20267e1ec08da13" gracePeriod=30 Dec 16 15:18:17 crc kubenswrapper[4775]: I1216 15:18:17.072362 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dea2122b-dae9-4775-ab5c-b10760260231" containerName="ceilometer-notification-agent" containerID="cri-o://1298c7bbb8e1491cd68cf1ce0fc4d17637c2cbfbb35bdf63a1502dff091919e7" gracePeriod=30 Dec 16 15:18:17 crc kubenswrapper[4775]: I1216 15:18:17.347549 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9a78807-ba42-4640-94be-a98bc08000a6" path="/var/lib/kubelet/pods/f9a78807-ba42-4640-94be-a98bc08000a6/volumes" Dec 16 15:18:18 crc kubenswrapper[4775]: I1216 15:18:18.087064 4775 generic.go:334] "Generic (PLEG): container finished" podID="dea2122b-dae9-4775-ab5c-b10760260231" 
containerID="f291d494b8e944fa236a5dc1bf55b8664e50491a176b87cbb20267e1ec08da13" exitCode=2 Dec 16 15:18:18 crc kubenswrapper[4775]: I1216 15:18:18.087756 4775 generic.go:334] "Generic (PLEG): container finished" podID="dea2122b-dae9-4775-ab5c-b10760260231" containerID="1298c7bbb8e1491cd68cf1ce0fc4d17637c2cbfbb35bdf63a1502dff091919e7" exitCode=0 Dec 16 15:18:18 crc kubenswrapper[4775]: I1216 15:18:18.087141 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dea2122b-dae9-4775-ab5c-b10760260231","Type":"ContainerDied","Data":"f291d494b8e944fa236a5dc1bf55b8664e50491a176b87cbb20267e1ec08da13"} Dec 16 15:18:18 crc kubenswrapper[4775]: I1216 15:18:18.087959 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dea2122b-dae9-4775-ab5c-b10760260231","Type":"ContainerDied","Data":"1298c7bbb8e1491cd68cf1ce0fc4d17637c2cbfbb35bdf63a1502dff091919e7"} Dec 16 15:18:20 crc kubenswrapper[4775]: I1216 15:18:20.107925 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bdkkf" event={"ID":"f16cf76e-4507-4b0a-aefd-c32b2b0763f1","Type":"ContainerDied","Data":"63525ee91b85db663a55315ccf39419df3bc609346f6df513788091ea6679d1d"} Dec 16 15:18:20 crc kubenswrapper[4775]: I1216 15:18:20.107931 4775 generic.go:334] "Generic (PLEG): container finished" podID="f16cf76e-4507-4b0a-aefd-c32b2b0763f1" containerID="63525ee91b85db663a55315ccf39419df3bc609346f6df513788091ea6679d1d" exitCode=0 Dec 16 15:18:20 crc kubenswrapper[4775]: I1216 15:18:20.153532 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-9b86998b5-6tmts" podUID="f9a78807-ba42-4640-94be-a98bc08000a6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.193:5353: i/o timeout" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.122317 4775 generic.go:334] "Generic (PLEG): container finished" podID="dea2122b-dae9-4775-ab5c-b10760260231" 
containerID="2b40cd0fc94548764448da9eda628eb441eab64010fde82204c1fd8e4137cba0" exitCode=0 Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.122855 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dea2122b-dae9-4775-ab5c-b10760260231","Type":"ContainerDied","Data":"2b40cd0fc94548764448da9eda628eb441eab64010fde82204c1fd8e4137cba0"} Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.284085 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.412264 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.412511 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.417180 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-sg-core-conf-yaml\") pod \"dea2122b-dae9-4775-ab5c-b10760260231\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.417231 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dea2122b-dae9-4775-ab5c-b10760260231-run-httpd\") pod \"dea2122b-dae9-4775-ab5c-b10760260231\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.417361 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-combined-ca-bundle\") pod \"dea2122b-dae9-4775-ab5c-b10760260231\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 
15:18:21.417426 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-config-data\") pod \"dea2122b-dae9-4775-ab5c-b10760260231\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.417453 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-scripts\") pod \"dea2122b-dae9-4775-ab5c-b10760260231\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.417489 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r89xn\" (UniqueName: \"kubernetes.io/projected/dea2122b-dae9-4775-ab5c-b10760260231-kube-api-access-r89xn\") pod \"dea2122b-dae9-4775-ab5c-b10760260231\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.417518 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dea2122b-dae9-4775-ab5c-b10760260231-log-httpd\") pod \"dea2122b-dae9-4775-ab5c-b10760260231\" (UID: \"dea2122b-dae9-4775-ab5c-b10760260231\") " Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.418996 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dea2122b-dae9-4775-ab5c-b10760260231-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dea2122b-dae9-4775-ab5c-b10760260231" (UID: "dea2122b-dae9-4775-ab5c-b10760260231"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.419148 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dea2122b-dae9-4775-ab5c-b10760260231-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dea2122b-dae9-4775-ab5c-b10760260231" (UID: "dea2122b-dae9-4775-ab5c-b10760260231"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.424086 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dea2122b-dae9-4775-ab5c-b10760260231-kube-api-access-r89xn" (OuterVolumeSpecName: "kube-api-access-r89xn") pod "dea2122b-dae9-4775-ab5c-b10760260231" (UID: "dea2122b-dae9-4775-ab5c-b10760260231"). InnerVolumeSpecName "kube-api-access-r89xn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.424488 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-scripts" (OuterVolumeSpecName: "scripts") pod "dea2122b-dae9-4775-ab5c-b10760260231" (UID: "dea2122b-dae9-4775-ab5c-b10760260231"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.447423 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dea2122b-dae9-4775-ab5c-b10760260231" (UID: "dea2122b-dae9-4775-ab5c-b10760260231"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.473152 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.522241 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.522265 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dea2122b-dae9-4775-ab5c-b10760260231-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.522275 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.522283 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r89xn\" (UniqueName: \"kubernetes.io/projected/dea2122b-dae9-4775-ab5c-b10760260231-kube-api-access-r89xn\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.522292 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dea2122b-dae9-4775-ab5c-b10760260231-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.531694 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dea2122b-dae9-4775-ab5c-b10760260231" (UID: "dea2122b-dae9-4775-ab5c-b10760260231"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.542035 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bdkkf" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.571260 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-config-data" (OuterVolumeSpecName: "config-data") pod "dea2122b-dae9-4775-ab5c-b10760260231" (UID: "dea2122b-dae9-4775-ab5c-b10760260231"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.623716 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-config-data\") pod \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\" (UID: \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\") " Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.623801 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-combined-ca-bundle\") pod \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\" (UID: \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\") " Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.623868 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-scripts\") pod \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\" (UID: \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\") " Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.624002 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4vh8\" (UniqueName: \"kubernetes.io/projected/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-kube-api-access-r4vh8\") pod 
\"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\" (UID: \"f16cf76e-4507-4b0a-aefd-c32b2b0763f1\") " Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.624459 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.624479 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea2122b-dae9-4775-ab5c-b10760260231-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.626851 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-kube-api-access-r4vh8" (OuterVolumeSpecName: "kube-api-access-r4vh8") pod "f16cf76e-4507-4b0a-aefd-c32b2b0763f1" (UID: "f16cf76e-4507-4b0a-aefd-c32b2b0763f1"). InnerVolumeSpecName "kube-api-access-r4vh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.627234 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-scripts" (OuterVolumeSpecName: "scripts") pod "f16cf76e-4507-4b0a-aefd-c32b2b0763f1" (UID: "f16cf76e-4507-4b0a-aefd-c32b2b0763f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.653716 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-config-data" (OuterVolumeSpecName: "config-data") pod "f16cf76e-4507-4b0a-aefd-c32b2b0763f1" (UID: "f16cf76e-4507-4b0a-aefd-c32b2b0763f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.654295 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f16cf76e-4507-4b0a-aefd-c32b2b0763f1" (UID: "f16cf76e-4507-4b0a-aefd-c32b2b0763f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.726426 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4vh8\" (UniqueName: \"kubernetes.io/projected/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-kube-api-access-r4vh8\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.726736 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.726822 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.726919 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f16cf76e-4507-4b0a-aefd-c32b2b0763f1-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.781671 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s28hs"] Dec 16 15:18:21 crc kubenswrapper[4775]: E1216 15:18:21.782183 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a78807-ba42-4640-94be-a98bc08000a6" containerName="init" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.782206 4775 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a78807-ba42-4640-94be-a98bc08000a6" containerName="init" Dec 16 15:18:21 crc kubenswrapper[4775]: E1216 15:18:21.782220 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea2122b-dae9-4775-ab5c-b10760260231" containerName="proxy-httpd" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.782227 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea2122b-dae9-4775-ab5c-b10760260231" containerName="proxy-httpd" Dec 16 15:18:21 crc kubenswrapper[4775]: E1216 15:18:21.782241 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea2122b-dae9-4775-ab5c-b10760260231" containerName="ceilometer-central-agent" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.782250 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea2122b-dae9-4775-ab5c-b10760260231" containerName="ceilometer-central-agent" Dec 16 15:18:21 crc kubenswrapper[4775]: E1216 15:18:21.782275 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea2122b-dae9-4775-ab5c-b10760260231" containerName="sg-core" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.782283 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea2122b-dae9-4775-ab5c-b10760260231" containerName="sg-core" Dec 16 15:18:21 crc kubenswrapper[4775]: E1216 15:18:21.782308 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a78807-ba42-4640-94be-a98bc08000a6" containerName="dnsmasq-dns" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.782319 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a78807-ba42-4640-94be-a98bc08000a6" containerName="dnsmasq-dns" Dec 16 15:18:21 crc kubenswrapper[4775]: E1216 15:18:21.782333 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea2122b-dae9-4775-ab5c-b10760260231" containerName="ceilometer-notification-agent" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.782341 4775 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="dea2122b-dae9-4775-ab5c-b10760260231" containerName="ceilometer-notification-agent" Dec 16 15:18:21 crc kubenswrapper[4775]: E1216 15:18:21.782375 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f16cf76e-4507-4b0a-aefd-c32b2b0763f1" containerName="nova-manage" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.782383 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f16cf76e-4507-4b0a-aefd-c32b2b0763f1" containerName="nova-manage" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.782615 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea2122b-dae9-4775-ab5c-b10760260231" containerName="sg-core" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.782630 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea2122b-dae9-4775-ab5c-b10760260231" containerName="proxy-httpd" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.782643 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea2122b-dae9-4775-ab5c-b10760260231" containerName="ceilometer-notification-agent" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.782673 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea2122b-dae9-4775-ab5c-b10760260231" containerName="ceilometer-central-agent" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.782688 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f16cf76e-4507-4b0a-aefd-c32b2b0763f1" containerName="nova-manage" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.782700 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9a78807-ba42-4640-94be-a98bc08000a6" containerName="dnsmasq-dns" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.784098 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s28hs" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.811575 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s28hs"] Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.930594 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w64zn\" (UniqueName: \"kubernetes.io/projected/e182acf8-e0f8-4ad4-b91f-0028568a79c3-kube-api-access-w64zn\") pod \"redhat-operators-s28hs\" (UID: \"e182acf8-e0f8-4ad4-b91f-0028568a79c3\") " pod="openshift-marketplace/redhat-operators-s28hs" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.930746 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e182acf8-e0f8-4ad4-b91f-0028568a79c3-utilities\") pod \"redhat-operators-s28hs\" (UID: \"e182acf8-e0f8-4ad4-b91f-0028568a79c3\") " pod="openshift-marketplace/redhat-operators-s28hs" Dec 16 15:18:21 crc kubenswrapper[4775]: I1216 15:18:21.930978 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e182acf8-e0f8-4ad4-b91f-0028568a79c3-catalog-content\") pod \"redhat-operators-s28hs\" (UID: \"e182acf8-e0f8-4ad4-b91f-0028568a79c3\") " pod="openshift-marketplace/redhat-operators-s28hs" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.033058 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w64zn\" (UniqueName: \"kubernetes.io/projected/e182acf8-e0f8-4ad4-b91f-0028568a79c3-kube-api-access-w64zn\") pod \"redhat-operators-s28hs\" (UID: \"e182acf8-e0f8-4ad4-b91f-0028568a79c3\") " pod="openshift-marketplace/redhat-operators-s28hs" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.033155 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e182acf8-e0f8-4ad4-b91f-0028568a79c3-utilities\") pod \"redhat-operators-s28hs\" (UID: \"e182acf8-e0f8-4ad4-b91f-0028568a79c3\") " pod="openshift-marketplace/redhat-operators-s28hs" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.033262 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e182acf8-e0f8-4ad4-b91f-0028568a79c3-catalog-content\") pod \"redhat-operators-s28hs\" (UID: \"e182acf8-e0f8-4ad4-b91f-0028568a79c3\") " pod="openshift-marketplace/redhat-operators-s28hs" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.033686 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e182acf8-e0f8-4ad4-b91f-0028568a79c3-utilities\") pod \"redhat-operators-s28hs\" (UID: \"e182acf8-e0f8-4ad4-b91f-0028568a79c3\") " pod="openshift-marketplace/redhat-operators-s28hs" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.033792 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e182acf8-e0f8-4ad4-b91f-0028568a79c3-catalog-content\") pod \"redhat-operators-s28hs\" (UID: \"e182acf8-e0f8-4ad4-b91f-0028568a79c3\") " pod="openshift-marketplace/redhat-operators-s28hs" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.062778 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w64zn\" (UniqueName: \"kubernetes.io/projected/e182acf8-e0f8-4ad4-b91f-0028568a79c3-kube-api-access-w64zn\") pod \"redhat-operators-s28hs\" (UID: \"e182acf8-e0f8-4ad4-b91f-0028568a79c3\") " pod="openshift-marketplace/redhat-operators-s28hs" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.131961 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s28hs" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.141997 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bdkkf" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.142017 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bdkkf" event={"ID":"f16cf76e-4507-4b0a-aefd-c32b2b0763f1","Type":"ContainerDied","Data":"b55dee4500c16b4b430f13822e947214dd17b88b9cbf6063ef647218ec1b0d4d"} Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.142077 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b55dee4500c16b4b430f13822e947214dd17b88b9cbf6063ef647218ec1b0d4d" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.149279 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dea2122b-dae9-4775-ab5c-b10760260231","Type":"ContainerDied","Data":"0ca7f5e1cfb5e064e6c212355969456bfa6a79b6e375250b7d803681e0f8cda9"} Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.149748 4775 scope.go:117] "RemoveContainer" containerID="b425de9c356e711920f83512dd9fb63d59114844d5aa232e8c5e42b89fb6b63b" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.149675 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.344269 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.369588 4775 scope.go:117] "RemoveContainer" containerID="f291d494b8e944fa236a5dc1bf55b8664e50491a176b87cbb20267e1ec08da13" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.380142 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.411759 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.412039 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="350c79c3-b66c-4384-99db-437cba78dcd3" containerName="nova-api-log" containerID="cri-o://fe1e01275823875199399a3ac389cd4adf20983d2d62db98a73751595bce26f4" gracePeriod=30 Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.412201 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="350c79c3-b66c-4384-99db-437cba78dcd3" containerName="nova-api-api" containerID="cri-o://777049e4802dd7b16cc97be573df5ff26fe3c5f656db253adfa39b1303276035" gracePeriod=30 Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.419983 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="350c79c3-b66c-4384-99db-437cba78dcd3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.420093 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="350c79c3-b66c-4384-99db-437cba78dcd3" containerName="nova-api-log" probeResult="failure" output="Get 
\"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.454728 4775 scope.go:117] "RemoveContainer" containerID="1298c7bbb8e1491cd68cf1ce0fc4d17637c2cbfbb35bdf63a1502dff091919e7" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.455693 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.458612 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.461635 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.461922 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.462079 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.496058 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.496347 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9f34a902-d86a-49b7-bd28-d47d1896d0e9" containerName="nova-metadata-log" containerID="cri-o://8c3117cbae2f0506b8005a0ec68a9aab8b7833ac524ecf3d2c904d3a1eaa39f2" gracePeriod=30 Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.496440 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9f34a902-d86a-49b7-bd28-d47d1896d0e9" containerName="nova-metadata-metadata" containerID="cri-o://d6ad92d47c32e5c273c39729e5c7947fd4b13cb81e1b6efa014c57f4f7247fd8" gracePeriod=30 Dec 16 15:18:22 crc 
kubenswrapper[4775]: I1216 15:18:22.499431 4775 scope.go:117] "RemoveContainer" containerID="2b40cd0fc94548764448da9eda628eb441eab64010fde82204c1fd8e4137cba0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.515967 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.535620 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.535821 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="66a41c08-de0e-46c7-ae0c-b56ba2544af5" containerName="nova-scheduler-scheduler" containerID="cri-o://a57cffbc0e696fc424fe650fd406d8400529e0266328256e2fa086f085050576" gracePeriod=30 Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.575771 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.575868 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.575929 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-425zs\" (UniqueName: \"kubernetes.io/projected/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-kube-api-access-425zs\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc 
kubenswrapper[4775]: I1216 15:18:22.575961 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.576037 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-config-data\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.576075 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-run-httpd\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.576103 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-log-httpd\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.576161 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-scripts\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.681148 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-config-data\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.681244 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-run-httpd\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.681302 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-log-httpd\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.681409 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-scripts\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.681518 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.681621 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.681697 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-425zs\" (UniqueName: \"kubernetes.io/projected/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-kube-api-access-425zs\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.681749 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.683943 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-run-httpd\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.687296 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.688133 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-config-data\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.688469 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-log-httpd\") pod \"ceilometer-0\" (UID: 
\"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.688865 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.689967 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.690509 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-scripts\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.707077 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s28hs"] Dec 16 15:18:22 crc kubenswrapper[4775]: W1216 15:18:22.714613 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode182acf8_e0f8_4ad4_b91f_0028568a79c3.slice/crio-d41ca4798b7ac0b8304867663e23c6feaf0a53d586fa01ac76aaa6f12cce9564 WatchSource:0}: Error finding container d41ca4798b7ac0b8304867663e23c6feaf0a53d586fa01ac76aaa6f12cce9564: Status 404 returned error can't find the container with id d41ca4798b7ac0b8304867663e23c6feaf0a53d586fa01ac76aaa6f12cce9564 Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.715166 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-425zs\" 
(UniqueName: \"kubernetes.io/projected/81d44d7a-71b0-40da-b940-ccdb6d63b4f9-kube-api-access-425zs\") pod \"ceilometer-0\" (UID: \"81d44d7a-71b0-40da-b940-ccdb6d63b4f9\") " pod="openstack/ceilometer-0" Dec 16 15:18:22 crc kubenswrapper[4775]: I1216 15:18:22.791767 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:18:23 crc kubenswrapper[4775]: I1216 15:18:23.161132 4775 generic.go:334] "Generic (PLEG): container finished" podID="350c79c3-b66c-4384-99db-437cba78dcd3" containerID="fe1e01275823875199399a3ac389cd4adf20983d2d62db98a73751595bce26f4" exitCode=143 Dec 16 15:18:23 crc kubenswrapper[4775]: I1216 15:18:23.161462 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"350c79c3-b66c-4384-99db-437cba78dcd3","Type":"ContainerDied","Data":"fe1e01275823875199399a3ac389cd4adf20983d2d62db98a73751595bce26f4"} Dec 16 15:18:23 crc kubenswrapper[4775]: I1216 15:18:23.166297 4775 generic.go:334] "Generic (PLEG): container finished" podID="e182acf8-e0f8-4ad4-b91f-0028568a79c3" containerID="31827892482b756cefe5a3a796f3da03cb878eb90a4de1f95fba4ddf980fb4f3" exitCode=0 Dec 16 15:18:23 crc kubenswrapper[4775]: I1216 15:18:23.166396 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s28hs" event={"ID":"e182acf8-e0f8-4ad4-b91f-0028568a79c3","Type":"ContainerDied","Data":"31827892482b756cefe5a3a796f3da03cb878eb90a4de1f95fba4ddf980fb4f3"} Dec 16 15:18:23 crc kubenswrapper[4775]: I1216 15:18:23.166428 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s28hs" event={"ID":"e182acf8-e0f8-4ad4-b91f-0028568a79c3","Type":"ContainerStarted","Data":"d41ca4798b7ac0b8304867663e23c6feaf0a53d586fa01ac76aaa6f12cce9564"} Dec 16 15:18:23 crc kubenswrapper[4775]: I1216 15:18:23.179186 4775 generic.go:334] "Generic (PLEG): container finished" podID="9f34a902-d86a-49b7-bd28-d47d1896d0e9" 
containerID="8c3117cbae2f0506b8005a0ec68a9aab8b7833ac524ecf3d2c904d3a1eaa39f2" exitCode=143 Dec 16 15:18:23 crc kubenswrapper[4775]: I1216 15:18:23.179285 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f34a902-d86a-49b7-bd28-d47d1896d0e9","Type":"ContainerDied","Data":"8c3117cbae2f0506b8005a0ec68a9aab8b7833ac524ecf3d2c904d3a1eaa39f2"} Dec 16 15:18:23 crc kubenswrapper[4775]: I1216 15:18:23.328615 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:18:23 crc kubenswrapper[4775]: W1216 15:18:23.332552 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81d44d7a_71b0_40da_b940_ccdb6d63b4f9.slice/crio-6ae7fd6a49ef04783fd2c746d91afa6c84adc3c45dd3d832f6d8418e40546ca3 WatchSource:0}: Error finding container 6ae7fd6a49ef04783fd2c746d91afa6c84adc3c45dd3d832f6d8418e40546ca3: Status 404 returned error can't find the container with id 6ae7fd6a49ef04783fd2c746d91afa6c84adc3c45dd3d832f6d8418e40546ca3 Dec 16 15:18:23 crc kubenswrapper[4775]: E1216 15:18:23.345771 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a57cffbc0e696fc424fe650fd406d8400529e0266328256e2fa086f085050576" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 15:18:23 crc kubenswrapper[4775]: E1216 15:18:23.348348 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a57cffbc0e696fc424fe650fd406d8400529e0266328256e2fa086f085050576" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 15:18:23 crc kubenswrapper[4775]: I1216 15:18:23.349065 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="dea2122b-dae9-4775-ab5c-b10760260231" path="/var/lib/kubelet/pods/dea2122b-dae9-4775-ab5c-b10760260231/volumes" Dec 16 15:18:23 crc kubenswrapper[4775]: E1216 15:18:23.349949 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a57cffbc0e696fc424fe650fd406d8400529e0266328256e2fa086f085050576" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 15:18:23 crc kubenswrapper[4775]: E1216 15:18:23.350005 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="66a41c08-de0e-46c7-ae0c-b56ba2544af5" containerName="nova-scheduler-scheduler" Dec 16 15:18:24 crc kubenswrapper[4775]: I1216 15:18:24.198748 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81d44d7a-71b0-40da-b940-ccdb6d63b4f9","Type":"ContainerStarted","Data":"400ac35fbdd717f42c4fb23da030669e077fe71fe08e4f9d405657f6be1e1ec6"} Dec 16 15:18:24 crc kubenswrapper[4775]: I1216 15:18:24.199177 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81d44d7a-71b0-40da-b940-ccdb6d63b4f9","Type":"ContainerStarted","Data":"6ae7fd6a49ef04783fd2c746d91afa6c84adc3c45dd3d832f6d8418e40546ca3"} Dec 16 15:18:25 crc kubenswrapper[4775]: I1216 15:18:25.209241 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81d44d7a-71b0-40da-b940-ccdb6d63b4f9","Type":"ContainerStarted","Data":"ab5f9894843c79ef10b2702fc778c7dec23f6bf60dae4070a5cbd24501653f32"} Dec 16 15:18:25 crc kubenswrapper[4775]: I1216 15:18:25.654472 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="9f34a902-d86a-49b7-bd28-d47d1896d0e9" 
containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:36870->10.217.0.197:8775: read: connection reset by peer" Dec 16 15:18:25 crc kubenswrapper[4775]: I1216 15:18:25.654541 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="9f34a902-d86a-49b7-bd28-d47d1896d0e9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:36856->10.217.0.197:8775: read: connection reset by peer" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.117349 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.222558 4775 generic.go:334] "Generic (PLEG): container finished" podID="9f34a902-d86a-49b7-bd28-d47d1896d0e9" containerID="d6ad92d47c32e5c273c39729e5c7947fd4b13cb81e1b6efa014c57f4f7247fd8" exitCode=0 Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.222614 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f34a902-d86a-49b7-bd28-d47d1896d0e9","Type":"ContainerDied","Data":"d6ad92d47c32e5c273c39729e5c7947fd4b13cb81e1b6efa014c57f4f7247fd8"} Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.222640 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f34a902-d86a-49b7-bd28-d47d1896d0e9","Type":"ContainerDied","Data":"fcd2b21b3b0b53bfb181e2ee348c9fd91d554efe9c52e4b048d021c43eae137a"} Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.222656 4775 scope.go:117] "RemoveContainer" containerID="d6ad92d47c32e5c273c39729e5c7947fd4b13cb81e1b6efa014c57f4f7247fd8" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.222767 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.228518 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81d44d7a-71b0-40da-b940-ccdb6d63b4f9","Type":"ContainerStarted","Data":"95e5264085637a6c0778c845c05cc51f3b8a836df51f9fc00cedc8ca20a1d854"} Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.253054 4775 scope.go:117] "RemoveContainer" containerID="8c3117cbae2f0506b8005a0ec68a9aab8b7833ac524ecf3d2c904d3a1eaa39f2" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.254754 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-624lb\" (UniqueName: \"kubernetes.io/projected/9f34a902-d86a-49b7-bd28-d47d1896d0e9-kube-api-access-624lb\") pod \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.254827 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-nova-metadata-tls-certs\") pod \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.254967 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f34a902-d86a-49b7-bd28-d47d1896d0e9-logs\") pod \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.255116 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-config-data\") pod \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 
15:18:26.255164 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-combined-ca-bundle\") pod \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\" (UID: \"9f34a902-d86a-49b7-bd28-d47d1896d0e9\") " Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.255446 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f34a902-d86a-49b7-bd28-d47d1896d0e9-logs" (OuterVolumeSpecName: "logs") pod "9f34a902-d86a-49b7-bd28-d47d1896d0e9" (UID: "9f34a902-d86a-49b7-bd28-d47d1896d0e9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.256234 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f34a902-d86a-49b7-bd28-d47d1896d0e9-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.259929 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f34a902-d86a-49b7-bd28-d47d1896d0e9-kube-api-access-624lb" (OuterVolumeSpecName: "kube-api-access-624lb") pod "9f34a902-d86a-49b7-bd28-d47d1896d0e9" (UID: "9f34a902-d86a-49b7-bd28-d47d1896d0e9"). InnerVolumeSpecName "kube-api-access-624lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.272847 4775 scope.go:117] "RemoveContainer" containerID="d6ad92d47c32e5c273c39729e5c7947fd4b13cb81e1b6efa014c57f4f7247fd8" Dec 16 15:18:26 crc kubenswrapper[4775]: E1216 15:18:26.273368 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6ad92d47c32e5c273c39729e5c7947fd4b13cb81e1b6efa014c57f4f7247fd8\": container with ID starting with d6ad92d47c32e5c273c39729e5c7947fd4b13cb81e1b6efa014c57f4f7247fd8 not found: ID does not exist" containerID="d6ad92d47c32e5c273c39729e5c7947fd4b13cb81e1b6efa014c57f4f7247fd8" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.273395 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6ad92d47c32e5c273c39729e5c7947fd4b13cb81e1b6efa014c57f4f7247fd8"} err="failed to get container status \"d6ad92d47c32e5c273c39729e5c7947fd4b13cb81e1b6efa014c57f4f7247fd8\": rpc error: code = NotFound desc = could not find container \"d6ad92d47c32e5c273c39729e5c7947fd4b13cb81e1b6efa014c57f4f7247fd8\": container with ID starting with d6ad92d47c32e5c273c39729e5c7947fd4b13cb81e1b6efa014c57f4f7247fd8 not found: ID does not exist" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.273416 4775 scope.go:117] "RemoveContainer" containerID="8c3117cbae2f0506b8005a0ec68a9aab8b7833ac524ecf3d2c904d3a1eaa39f2" Dec 16 15:18:26 crc kubenswrapper[4775]: E1216 15:18:26.273780 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c3117cbae2f0506b8005a0ec68a9aab8b7833ac524ecf3d2c904d3a1eaa39f2\": container with ID starting with 8c3117cbae2f0506b8005a0ec68a9aab8b7833ac524ecf3d2c904d3a1eaa39f2 not found: ID does not exist" containerID="8c3117cbae2f0506b8005a0ec68a9aab8b7833ac524ecf3d2c904d3a1eaa39f2" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.273796 
4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c3117cbae2f0506b8005a0ec68a9aab8b7833ac524ecf3d2c904d3a1eaa39f2"} err="failed to get container status \"8c3117cbae2f0506b8005a0ec68a9aab8b7833ac524ecf3d2c904d3a1eaa39f2\": rpc error: code = NotFound desc = could not find container \"8c3117cbae2f0506b8005a0ec68a9aab8b7833ac524ecf3d2c904d3a1eaa39f2\": container with ID starting with 8c3117cbae2f0506b8005a0ec68a9aab8b7833ac524ecf3d2c904d3a1eaa39f2 not found: ID does not exist" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.284746 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-config-data" (OuterVolumeSpecName: "config-data") pod "9f34a902-d86a-49b7-bd28-d47d1896d0e9" (UID: "9f34a902-d86a-49b7-bd28-d47d1896d0e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.288977 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f34a902-d86a-49b7-bd28-d47d1896d0e9" (UID: "9f34a902-d86a-49b7-bd28-d47d1896d0e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.315996 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9f34a902-d86a-49b7-bd28-d47d1896d0e9" (UID: "9f34a902-d86a-49b7-bd28-d47d1896d0e9"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.359395 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.359432 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.359444 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-624lb\" (UniqueName: \"kubernetes.io/projected/9f34a902-d86a-49b7-bd28-d47d1896d0e9-kube-api-access-624lb\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.359453 4775 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f34a902-d86a-49b7-bd28-d47d1896d0e9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.556466 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.573960 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.583087 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:18:26 crc kubenswrapper[4775]: E1216 15:18:26.583645 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f34a902-d86a-49b7-bd28-d47d1896d0e9" containerName="nova-metadata-metadata" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.583665 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f34a902-d86a-49b7-bd28-d47d1896d0e9" 
containerName="nova-metadata-metadata" Dec 16 15:18:26 crc kubenswrapper[4775]: E1216 15:18:26.583696 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f34a902-d86a-49b7-bd28-d47d1896d0e9" containerName="nova-metadata-log" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.583705 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f34a902-d86a-49b7-bd28-d47d1896d0e9" containerName="nova-metadata-log" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.583902 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f34a902-d86a-49b7-bd28-d47d1896d0e9" containerName="nova-metadata-log" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.583928 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f34a902-d86a-49b7-bd28-d47d1896d0e9" containerName="nova-metadata-metadata" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.584968 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.587437 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.587652 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.594691 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.666368 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwmbf\" (UniqueName: \"kubernetes.io/projected/9e94b8ac-6213-42f6-94ff-7e42e358fcf9-kube-api-access-xwmbf\") pod \"nova-metadata-0\" (UID: \"9e94b8ac-6213-42f6-94ff-7e42e358fcf9\") " pod="openstack/nova-metadata-0" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.667150 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e94b8ac-6213-42f6-94ff-7e42e358fcf9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9e94b8ac-6213-42f6-94ff-7e42e358fcf9\") " pod="openstack/nova-metadata-0" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.667294 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e94b8ac-6213-42f6-94ff-7e42e358fcf9-logs\") pod \"nova-metadata-0\" (UID: \"9e94b8ac-6213-42f6-94ff-7e42e358fcf9\") " pod="openstack/nova-metadata-0" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.667333 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e94b8ac-6213-42f6-94ff-7e42e358fcf9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9e94b8ac-6213-42f6-94ff-7e42e358fcf9\") " pod="openstack/nova-metadata-0" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.667369 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e94b8ac-6213-42f6-94ff-7e42e358fcf9-config-data\") pod \"nova-metadata-0\" (UID: \"9e94b8ac-6213-42f6-94ff-7e42e358fcf9\") " pod="openstack/nova-metadata-0" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.769287 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e94b8ac-6213-42f6-94ff-7e42e358fcf9-config-data\") pod \"nova-metadata-0\" (UID: \"9e94b8ac-6213-42f6-94ff-7e42e358fcf9\") " pod="openstack/nova-metadata-0" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.769361 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwmbf\" (UniqueName: 
\"kubernetes.io/projected/9e94b8ac-6213-42f6-94ff-7e42e358fcf9-kube-api-access-xwmbf\") pod \"nova-metadata-0\" (UID: \"9e94b8ac-6213-42f6-94ff-7e42e358fcf9\") " pod="openstack/nova-metadata-0" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.769478 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e94b8ac-6213-42f6-94ff-7e42e358fcf9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9e94b8ac-6213-42f6-94ff-7e42e358fcf9\") " pod="openstack/nova-metadata-0" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.769522 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e94b8ac-6213-42f6-94ff-7e42e358fcf9-logs\") pod \"nova-metadata-0\" (UID: \"9e94b8ac-6213-42f6-94ff-7e42e358fcf9\") " pod="openstack/nova-metadata-0" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.769545 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e94b8ac-6213-42f6-94ff-7e42e358fcf9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9e94b8ac-6213-42f6-94ff-7e42e358fcf9\") " pod="openstack/nova-metadata-0" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.770268 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e94b8ac-6213-42f6-94ff-7e42e358fcf9-logs\") pod \"nova-metadata-0\" (UID: \"9e94b8ac-6213-42f6-94ff-7e42e358fcf9\") " pod="openstack/nova-metadata-0" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.774432 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e94b8ac-6213-42f6-94ff-7e42e358fcf9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9e94b8ac-6213-42f6-94ff-7e42e358fcf9\") " pod="openstack/nova-metadata-0" Dec 16 
15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.775717 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e94b8ac-6213-42f6-94ff-7e42e358fcf9-config-data\") pod \"nova-metadata-0\" (UID: \"9e94b8ac-6213-42f6-94ff-7e42e358fcf9\") " pod="openstack/nova-metadata-0" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.776476 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e94b8ac-6213-42f6-94ff-7e42e358fcf9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9e94b8ac-6213-42f6-94ff-7e42e358fcf9\") " pod="openstack/nova-metadata-0" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.788670 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwmbf\" (UniqueName: \"kubernetes.io/projected/9e94b8ac-6213-42f6-94ff-7e42e358fcf9-kube-api-access-xwmbf\") pod \"nova-metadata-0\" (UID: \"9e94b8ac-6213-42f6-94ff-7e42e358fcf9\") " pod="openstack/nova-metadata-0" Dec 16 15:18:26 crc kubenswrapper[4775]: I1216 15:18:26.917872 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:18:27 crc kubenswrapper[4775]: I1216 15:18:27.252369 4775 generic.go:334] "Generic (PLEG): container finished" podID="66a41c08-de0e-46c7-ae0c-b56ba2544af5" containerID="a57cffbc0e696fc424fe650fd406d8400529e0266328256e2fa086f085050576" exitCode=0 Dec 16 15:18:27 crc kubenswrapper[4775]: I1216 15:18:27.252734 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"66a41c08-de0e-46c7-ae0c-b56ba2544af5","Type":"ContainerDied","Data":"a57cffbc0e696fc424fe650fd406d8400529e0266328256e2fa086f085050576"} Dec 16 15:18:27 crc kubenswrapper[4775]: I1216 15:18:27.259168 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81d44d7a-71b0-40da-b940-ccdb6d63b4f9","Type":"ContainerStarted","Data":"d23c700caef81b0408bf5ea2e9cc5a0ee4660a234009d000868c27ccbc5e83a1"} Dec 16 15:18:27 crc kubenswrapper[4775]: I1216 15:18:27.260785 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 15:18:27 crc kubenswrapper[4775]: I1216 15:18:27.317613 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:18:27 crc kubenswrapper[4775]: I1216 15:18:27.343688 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.6948255049999998 podStartE2EDuration="5.343670685s" podCreationTimestamp="2025-12-16 15:18:22 +0000 UTC" firstStartedPulling="2025-12-16 15:18:23.335246331 +0000 UTC m=+1428.286325244" lastFinishedPulling="2025-12-16 15:18:26.984091501 +0000 UTC m=+1431.935170424" observedRunningTime="2025-12-16 15:18:27.300444522 +0000 UTC m=+1432.251523455" watchObservedRunningTime="2025-12-16 15:18:27.343670685 +0000 UTC m=+1432.294749608" Dec 16 15:18:27 crc kubenswrapper[4775]: I1216 15:18:27.372652 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f34a902-d86a-49b7-bd28-d47d1896d0e9" path="/var/lib/kubelet/pods/9f34a902-d86a-49b7-bd28-d47d1896d0e9/volumes" Dec 16 15:18:27 crc kubenswrapper[4775]: I1216 15:18:27.400812 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a41c08-de0e-46c7-ae0c-b56ba2544af5-combined-ca-bundle\") pod \"66a41c08-de0e-46c7-ae0c-b56ba2544af5\" (UID: \"66a41c08-de0e-46c7-ae0c-b56ba2544af5\") " Dec 16 15:18:27 crc kubenswrapper[4775]: I1216 15:18:27.400960 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66a41c08-de0e-46c7-ae0c-b56ba2544af5-config-data\") pod \"66a41c08-de0e-46c7-ae0c-b56ba2544af5\" (UID: \"66a41c08-de0e-46c7-ae0c-b56ba2544af5\") " Dec 16 15:18:27 crc kubenswrapper[4775]: I1216 15:18:27.401002 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwx7z\" (UniqueName: \"kubernetes.io/projected/66a41c08-de0e-46c7-ae0c-b56ba2544af5-kube-api-access-jwx7z\") pod \"66a41c08-de0e-46c7-ae0c-b56ba2544af5\" (UID: \"66a41c08-de0e-46c7-ae0c-b56ba2544af5\") " 
Dec 16 15:18:27 crc kubenswrapper[4775]: I1216 15:18:27.409608 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66a41c08-de0e-46c7-ae0c-b56ba2544af5-kube-api-access-jwx7z" (OuterVolumeSpecName: "kube-api-access-jwx7z") pod "66a41c08-de0e-46c7-ae0c-b56ba2544af5" (UID: "66a41c08-de0e-46c7-ae0c-b56ba2544af5"). InnerVolumeSpecName "kube-api-access-jwx7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:27 crc kubenswrapper[4775]: I1216 15:18:27.442417 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66a41c08-de0e-46c7-ae0c-b56ba2544af5-config-data" (OuterVolumeSpecName: "config-data") pod "66a41c08-de0e-46c7-ae0c-b56ba2544af5" (UID: "66a41c08-de0e-46c7-ae0c-b56ba2544af5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:27 crc kubenswrapper[4775]: I1216 15:18:27.458311 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66a41c08-de0e-46c7-ae0c-b56ba2544af5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66a41c08-de0e-46c7-ae0c-b56ba2544af5" (UID: "66a41c08-de0e-46c7-ae0c-b56ba2544af5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:27 crc kubenswrapper[4775]: I1216 15:18:27.504051 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a41c08-de0e-46c7-ae0c-b56ba2544af5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:27 crc kubenswrapper[4775]: I1216 15:18:27.504090 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66a41c08-de0e-46c7-ae0c-b56ba2544af5-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:27 crc kubenswrapper[4775]: I1216 15:18:27.504099 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwx7z\" (UniqueName: \"kubernetes.io/projected/66a41c08-de0e-46c7-ae0c-b56ba2544af5-kube-api-access-jwx7z\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:27 crc kubenswrapper[4775]: I1216 15:18:27.512256 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:18:27 crc kubenswrapper[4775]: W1216 15:18:27.513620 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e94b8ac_6213_42f6_94ff_7e42e358fcf9.slice/crio-bcf5e68e648ac65d0e9f436d4141e306fee9fa6d6d3d4cbb8f3618c71cef93cf WatchSource:0}: Error finding container bcf5e68e648ac65d0e9f436d4141e306fee9fa6d6d3d4cbb8f3618c71cef93cf: Status 404 returned error can't find the container with id bcf5e68e648ac65d0e9f436d4141e306fee9fa6d6d3d4cbb8f3618c71cef93cf Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.181465 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.271780 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.271813 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"66a41c08-de0e-46c7-ae0c-b56ba2544af5","Type":"ContainerDied","Data":"e2ff7cda78fbcc0cd39d4966e8d4a8170f2de2952d5011e37805ebd7b5dc7ea7"} Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.271904 4775 scope.go:117] "RemoveContainer" containerID="a57cffbc0e696fc424fe650fd406d8400529e0266328256e2fa086f085050576" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.276065 4775 generic.go:334] "Generic (PLEG): container finished" podID="350c79c3-b66c-4384-99db-437cba78dcd3" containerID="777049e4802dd7b16cc97be573df5ff26fe3c5f656db253adfa39b1303276035" exitCode=0 Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.276090 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.276136 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"350c79c3-b66c-4384-99db-437cba78dcd3","Type":"ContainerDied","Data":"777049e4802dd7b16cc97be573df5ff26fe3c5f656db253adfa39b1303276035"} Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.276176 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"350c79c3-b66c-4384-99db-437cba78dcd3","Type":"ContainerDied","Data":"6a0ca071fc42570ff50802a261b1a2a520de8a25ecc8345904a44a534f2603b6"} Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.278508 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e94b8ac-6213-42f6-94ff-7e42e358fcf9","Type":"ContainerStarted","Data":"35527708e49e3378687a764a79cb7602422976cac0b0019d9d8425eb9ef3cf1d"} Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.278540 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"9e94b8ac-6213-42f6-94ff-7e42e358fcf9","Type":"ContainerStarted","Data":"8261f22766767ce0088e6f67593b74240b06d3a9c5075cdf9ec57fc13579c88f"} Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.278551 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e94b8ac-6213-42f6-94ff-7e42e358fcf9","Type":"ContainerStarted","Data":"bcf5e68e648ac65d0e9f436d4141e306fee9fa6d6d3d4cbb8f3618c71cef93cf"} Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.303332 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.303312301 podStartE2EDuration="2.303312301s" podCreationTimestamp="2025-12-16 15:18:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:18:28.300099277 +0000 UTC m=+1433.251178190" watchObservedRunningTime="2025-12-16 15:18:28.303312301 +0000 UTC m=+1433.254391224" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.324699 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-combined-ca-bundle\") pod \"350c79c3-b66c-4384-99db-437cba78dcd3\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.324789 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-config-data\") pod \"350c79c3-b66c-4384-99db-437cba78dcd3\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.324825 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kllkl\" (UniqueName: 
\"kubernetes.io/projected/350c79c3-b66c-4384-99db-437cba78dcd3-kube-api-access-kllkl\") pod \"350c79c3-b66c-4384-99db-437cba78dcd3\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.324918 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-internal-tls-certs\") pod \"350c79c3-b66c-4384-99db-437cba78dcd3\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.324979 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-public-tls-certs\") pod \"350c79c3-b66c-4384-99db-437cba78dcd3\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.325025 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/350c79c3-b66c-4384-99db-437cba78dcd3-logs\") pod \"350c79c3-b66c-4384-99db-437cba78dcd3\" (UID: \"350c79c3-b66c-4384-99db-437cba78dcd3\") " Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.326227 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/350c79c3-b66c-4384-99db-437cba78dcd3-logs" (OuterVolumeSpecName: "logs") pod "350c79c3-b66c-4384-99db-437cba78dcd3" (UID: "350c79c3-b66c-4384-99db-437cba78dcd3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.340162 4775 scope.go:117] "RemoveContainer" containerID="777049e4802dd7b16cc97be573df5ff26fe3c5f656db253adfa39b1303276035" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.340469 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/350c79c3-b66c-4384-99db-437cba78dcd3-kube-api-access-kllkl" (OuterVolumeSpecName: "kube-api-access-kllkl") pod "350c79c3-b66c-4384-99db-437cba78dcd3" (UID: "350c79c3-b66c-4384-99db-437cba78dcd3"). InnerVolumeSpecName "kube-api-access-kllkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.346864 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.366512 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-config-data" (OuterVolumeSpecName: "config-data") pod "350c79c3-b66c-4384-99db-437cba78dcd3" (UID: "350c79c3-b66c-4384-99db-437cba78dcd3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.373042 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.380969 4775 scope.go:117] "RemoveContainer" containerID="fe1e01275823875199399a3ac389cd4adf20983d2d62db98a73751595bce26f4" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.385740 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:18:28 crc kubenswrapper[4775]: E1216 15:18:28.386409 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a41c08-de0e-46c7-ae0c-b56ba2544af5" containerName="nova-scheduler-scheduler" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.386439 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a41c08-de0e-46c7-ae0c-b56ba2544af5" containerName="nova-scheduler-scheduler" Dec 16 15:18:28 crc kubenswrapper[4775]: E1216 15:18:28.386468 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="350c79c3-b66c-4384-99db-437cba78dcd3" containerName="nova-api-log" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.386479 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="350c79c3-b66c-4384-99db-437cba78dcd3" containerName="nova-api-log" Dec 16 15:18:28 crc kubenswrapper[4775]: E1216 15:18:28.386510 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="350c79c3-b66c-4384-99db-437cba78dcd3" containerName="nova-api-api" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.386520 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="350c79c3-b66c-4384-99db-437cba78dcd3" containerName="nova-api-api" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.386757 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="350c79c3-b66c-4384-99db-437cba78dcd3" containerName="nova-api-log" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.386789 4775 
memory_manager.go:354] "RemoveStaleState removing state" podUID="66a41c08-de0e-46c7-ae0c-b56ba2544af5" containerName="nova-scheduler-scheduler" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.386802 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="350c79c3-b66c-4384-99db-437cba78dcd3" containerName="nova-api-api" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.387994 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.396364 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.410089 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "350c79c3-b66c-4384-99db-437cba78dcd3" (UID: "350c79c3-b66c-4384-99db-437cba78dcd3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.415008 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.422702 4775 scope.go:117] "RemoveContainer" containerID="777049e4802dd7b16cc97be573df5ff26fe3c5f656db253adfa39b1303276035" Dec 16 15:18:28 crc kubenswrapper[4775]: E1216 15:18:28.423386 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"777049e4802dd7b16cc97be573df5ff26fe3c5f656db253adfa39b1303276035\": container with ID starting with 777049e4802dd7b16cc97be573df5ff26fe3c5f656db253adfa39b1303276035 not found: ID does not exist" containerID="777049e4802dd7b16cc97be573df5ff26fe3c5f656db253adfa39b1303276035" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.423424 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777049e4802dd7b16cc97be573df5ff26fe3c5f656db253adfa39b1303276035"} err="failed to get container status \"777049e4802dd7b16cc97be573df5ff26fe3c5f656db253adfa39b1303276035\": rpc error: code = NotFound desc = could not find container \"777049e4802dd7b16cc97be573df5ff26fe3c5f656db253adfa39b1303276035\": container with ID starting with 777049e4802dd7b16cc97be573df5ff26fe3c5f656db253adfa39b1303276035 not found: ID does not exist" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.423760 4775 scope.go:117] "RemoveContainer" containerID="fe1e01275823875199399a3ac389cd4adf20983d2d62db98a73751595bce26f4" Dec 16 15:18:28 crc kubenswrapper[4775]: E1216 15:18:28.424152 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe1e01275823875199399a3ac389cd4adf20983d2d62db98a73751595bce26f4\": container with ID starting with fe1e01275823875199399a3ac389cd4adf20983d2d62db98a73751595bce26f4 not found: ID does not 
exist" containerID="fe1e01275823875199399a3ac389cd4adf20983d2d62db98a73751595bce26f4" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.424210 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe1e01275823875199399a3ac389cd4adf20983d2d62db98a73751595bce26f4"} err="failed to get container status \"fe1e01275823875199399a3ac389cd4adf20983d2d62db98a73751595bce26f4\": rpc error: code = NotFound desc = could not find container \"fe1e01275823875199399a3ac389cd4adf20983d2d62db98a73751595bce26f4\": container with ID starting with fe1e01275823875199399a3ac389cd4adf20983d2d62db98a73751595bce26f4 not found: ID does not exist" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.427513 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.427539 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.427549 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kllkl\" (UniqueName: \"kubernetes.io/projected/350c79c3-b66c-4384-99db-437cba78dcd3-kube-api-access-kllkl\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.427558 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/350c79c3-b66c-4384-99db-437cba78dcd3-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.429416 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-internal-tls-certs" (OuterVolumeSpecName: 
"internal-tls-certs") pod "350c79c3-b66c-4384-99db-437cba78dcd3" (UID: "350c79c3-b66c-4384-99db-437cba78dcd3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.433208 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "350c79c3-b66c-4384-99db-437cba78dcd3" (UID: "350c79c3-b66c-4384-99db-437cba78dcd3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.529844 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg9d6\" (UniqueName: \"kubernetes.io/projected/f991d67b-2c42-4f93-aacb-3486ea1e43a8-kube-api-access-rg9d6\") pod \"nova-scheduler-0\" (UID: \"f991d67b-2c42-4f93-aacb-3486ea1e43a8\") " pod="openstack/nova-scheduler-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.529945 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f991d67b-2c42-4f93-aacb-3486ea1e43a8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f991d67b-2c42-4f93-aacb-3486ea1e43a8\") " pod="openstack/nova-scheduler-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.529977 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f991d67b-2c42-4f93-aacb-3486ea1e43a8-config-data\") pod \"nova-scheduler-0\" (UID: \"f991d67b-2c42-4f93-aacb-3486ea1e43a8\") " pod="openstack/nova-scheduler-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.530412 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.530452 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/350c79c3-b66c-4384-99db-437cba78dcd3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.612972 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.632204 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg9d6\" (UniqueName: \"kubernetes.io/projected/f991d67b-2c42-4f93-aacb-3486ea1e43a8-kube-api-access-rg9d6\") pod \"nova-scheduler-0\" (UID: \"f991d67b-2c42-4f93-aacb-3486ea1e43a8\") " pod="openstack/nova-scheduler-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.632270 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f991d67b-2c42-4f93-aacb-3486ea1e43a8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f991d67b-2c42-4f93-aacb-3486ea1e43a8\") " pod="openstack/nova-scheduler-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.632292 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f991d67b-2c42-4f93-aacb-3486ea1e43a8-config-data\") pod \"nova-scheduler-0\" (UID: \"f991d67b-2c42-4f93-aacb-3486ea1e43a8\") " pod="openstack/nova-scheduler-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.636213 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f991d67b-2c42-4f93-aacb-3486ea1e43a8-config-data\") pod \"nova-scheduler-0\" (UID: \"f991d67b-2c42-4f93-aacb-3486ea1e43a8\") " pod="openstack/nova-scheduler-0" Dec 16 15:18:28 
crc kubenswrapper[4775]: I1216 15:18:28.639329 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f991d67b-2c42-4f93-aacb-3486ea1e43a8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f991d67b-2c42-4f93-aacb-3486ea1e43a8\") " pod="openstack/nova-scheduler-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.658492 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg9d6\" (UniqueName: \"kubernetes.io/projected/f991d67b-2c42-4f93-aacb-3486ea1e43a8-kube-api-access-rg9d6\") pod \"nova-scheduler-0\" (UID: \"f991d67b-2c42-4f93-aacb-3486ea1e43a8\") " pod="openstack/nova-scheduler-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.658559 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.665431 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.667406 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.670827 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.671021 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.671143 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.676035 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.713934 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.734732 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-config-data\") pod \"nova-api-0\" (UID: \"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.734842 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zxfr\" (UniqueName: \"kubernetes.io/projected/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-kube-api-access-5zxfr\") pod \"nova-api-0\" (UID: \"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.735075 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-public-tls-certs\") pod \"nova-api-0\" (UID: \"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.735168 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.735216 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-logs\") pod \"nova-api-0\" (UID: \"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.735450 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.837640 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-public-tls-certs\") pod \"nova-api-0\" (UID: \"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.837718 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.837746 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-logs\") pod \"nova-api-0\" (UID: \"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.837870 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.837952 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-config-data\") pod \"nova-api-0\" (UID: 
\"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.837997 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zxfr\" (UniqueName: \"kubernetes.io/projected/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-kube-api-access-5zxfr\") pod \"nova-api-0\" (UID: \"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.838712 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-logs\") pod \"nova-api-0\" (UID: \"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.841589 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.842504 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-config-data\") pod \"nova-api-0\" (UID: \"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.843119 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-public-tls-certs\") pod \"nova-api-0\" (UID: \"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.846765 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.855327 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zxfr\" (UniqueName: \"kubernetes.io/projected/22a00983-b0df-4afb-bbc2-2f7da7c8c05e-kube-api-access-5zxfr\") pod \"nova-api-0\" (UID: \"22a00983-b0df-4afb-bbc2-2f7da7c8c05e\") " pod="openstack/nova-api-0" Dec 16 15:18:28 crc kubenswrapper[4775]: I1216 15:18:28.990483 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:18:29 crc kubenswrapper[4775]: I1216 15:18:29.349481 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="350c79c3-b66c-4384-99db-437cba78dcd3" path="/var/lib/kubelet/pods/350c79c3-b66c-4384-99db-437cba78dcd3/volumes" Dec 16 15:18:29 crc kubenswrapper[4775]: I1216 15:18:29.350262 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66a41c08-de0e-46c7-ae0c-b56ba2544af5" path="/var/lib/kubelet/pods/66a41c08-de0e-46c7-ae0c-b56ba2544af5/volumes" Dec 16 15:18:31 crc kubenswrapper[4775]: I1216 15:18:31.918594 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 15:18:31 crc kubenswrapper[4775]: I1216 15:18:31.919100 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 15:18:32 crc kubenswrapper[4775]: I1216 15:18:32.324956 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s28hs" event={"ID":"e182acf8-e0f8-4ad4-b91f-0028568a79c3","Type":"ContainerStarted","Data":"99264496d2ae041242cdd6b8f64319b97407868f431d6f707ce19bdb27495f68"} Dec 16 15:18:32 crc kubenswrapper[4775]: I1216 15:18:32.489131 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-scheduler-0"] Dec 16 15:18:32 crc kubenswrapper[4775]: W1216 15:18:32.490598 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf991d67b_2c42_4f93_aacb_3486ea1e43a8.slice/crio-8e41ee5e294c8802f3a2a7ae05c4f125fd5f646f863d055b981529073e7a44e5 WatchSource:0}: Error finding container 8e41ee5e294c8802f3a2a7ae05c4f125fd5f646f863d055b981529073e7a44e5: Status 404 returned error can't find the container with id 8e41ee5e294c8802f3a2a7ae05c4f125fd5f646f863d055b981529073e7a44e5 Dec 16 15:18:32 crc kubenswrapper[4775]: I1216 15:18:32.615959 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:18:32 crc kubenswrapper[4775]: I1216 15:18:32.869232 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:18:32 crc kubenswrapper[4775]: I1216 15:18:32.869597 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:18:33 crc kubenswrapper[4775]: I1216 15:18:33.335377 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f991d67b-2c42-4f93-aacb-3486ea1e43a8","Type":"ContainerStarted","Data":"c6121904adafe2ad86c7e720abbe6fe618b4df582746e0a959f806f350536715"} Dec 16 15:18:33 crc kubenswrapper[4775]: I1216 15:18:33.335434 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"f991d67b-2c42-4f93-aacb-3486ea1e43a8","Type":"ContainerStarted","Data":"8e41ee5e294c8802f3a2a7ae05c4f125fd5f646f863d055b981529073e7a44e5"} Dec 16 15:18:33 crc kubenswrapper[4775]: I1216 15:18:33.341473 4775 generic.go:334] "Generic (PLEG): container finished" podID="e182acf8-e0f8-4ad4-b91f-0028568a79c3" containerID="99264496d2ae041242cdd6b8f64319b97407868f431d6f707ce19bdb27495f68" exitCode=0 Dec 16 15:18:33 crc kubenswrapper[4775]: I1216 15:18:33.360961 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22a00983-b0df-4afb-bbc2-2f7da7c8c05e","Type":"ContainerStarted","Data":"612520cac271ff5dbda2c62a0626f46e9c632165b0962e7c1b02fa2e211dd589"} Dec 16 15:18:33 crc kubenswrapper[4775]: I1216 15:18:33.361015 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22a00983-b0df-4afb-bbc2-2f7da7c8c05e","Type":"ContainerStarted","Data":"a8bc819b954a539a12ff4642362e376f9411a95940aa1b03e37dd2e13fc16631"} Dec 16 15:18:33 crc kubenswrapper[4775]: I1216 15:18:33.361027 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s28hs" event={"ID":"e182acf8-e0f8-4ad4-b91f-0028568a79c3","Type":"ContainerDied","Data":"99264496d2ae041242cdd6b8f64319b97407868f431d6f707ce19bdb27495f68"} Dec 16 15:18:33 crc kubenswrapper[4775]: I1216 15:18:33.384471 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=5.384446913 podStartE2EDuration="5.384446913s" podCreationTimestamp="2025-12-16 15:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:18:33.356845794 +0000 UTC m=+1438.307924717" watchObservedRunningTime="2025-12-16 15:18:33.384446913 +0000 UTC m=+1438.335525836" Dec 16 15:18:33 crc kubenswrapper[4775]: I1216 15:18:33.714234 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-scheduler-0" Dec 16 15:18:35 crc kubenswrapper[4775]: I1216 15:18:35.371833 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22a00983-b0df-4afb-bbc2-2f7da7c8c05e","Type":"ContainerStarted","Data":"9c622e52239a2bb481c9c44045873dc76288ca8d32aa8db90e74a0cdfaf09e10"} Dec 16 15:18:36 crc kubenswrapper[4775]: I1216 15:18:36.919081 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 15:18:36 crc kubenswrapper[4775]: I1216 15:18:36.919130 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 15:18:37 crc kubenswrapper[4775]: I1216 15:18:37.950034 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9e94b8ac-6213-42f6-94ff-7e42e358fcf9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 15:18:37 crc kubenswrapper[4775]: I1216 15:18:37.950047 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9e94b8ac-6213-42f6-94ff-7e42e358fcf9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 15:18:38 crc kubenswrapper[4775]: I1216 15:18:38.714166 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 15:18:38 crc kubenswrapper[4775]: I1216 15:18:38.745445 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 15:18:39 crc kubenswrapper[4775]: I1216 15:18:39.444323 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=11.444296197 
podStartE2EDuration="11.444296197s" podCreationTimestamp="2025-12-16 15:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:18:39.434546923 +0000 UTC m=+1444.385625846" watchObservedRunningTime="2025-12-16 15:18:39.444296197 +0000 UTC m=+1444.395375130" Dec 16 15:18:39 crc kubenswrapper[4775]: I1216 15:18:39.447584 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 15:18:40 crc kubenswrapper[4775]: I1216 15:18:40.422515 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s28hs" event={"ID":"e182acf8-e0f8-4ad4-b91f-0028568a79c3","Type":"ContainerStarted","Data":"18981acf3fa17eee9010aca72b226738ee963b2f96b1d33df035e66a4f117329"} Dec 16 15:18:40 crc kubenswrapper[4775]: I1216 15:18:40.446245 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s28hs" podStartSLOduration=2.9123359620000002 podStartE2EDuration="19.446222405s" podCreationTimestamp="2025-12-16 15:18:21 +0000 UTC" firstStartedPulling="2025-12-16 15:18:23.177437886 +0000 UTC m=+1428.128516809" lastFinishedPulling="2025-12-16 15:18:39.711324319 +0000 UTC m=+1444.662403252" observedRunningTime="2025-12-16 15:18:40.437414541 +0000 UTC m=+1445.388493454" watchObservedRunningTime="2025-12-16 15:18:40.446222405 +0000 UTC m=+1445.397301338" Dec 16 15:18:42 crc kubenswrapper[4775]: I1216 15:18:42.132164 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s28hs" Dec 16 15:18:42 crc kubenswrapper[4775]: I1216 15:18:42.132445 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s28hs" Dec 16 15:18:43 crc kubenswrapper[4775]: I1216 15:18:43.191443 4775 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-s28hs" podUID="e182acf8-e0f8-4ad4-b91f-0028568a79c3" containerName="registry-server" probeResult="failure" output=< Dec 16 15:18:43 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Dec 16 15:18:43 crc kubenswrapper[4775]: > Dec 16 15:18:46 crc kubenswrapper[4775]: I1216 15:18:46.924793 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 15:18:46 crc kubenswrapper[4775]: I1216 15:18:46.926292 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 15:18:46 crc kubenswrapper[4775]: I1216 15:18:46.930724 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 15:18:46 crc kubenswrapper[4775]: I1216 15:18:46.931801 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 15:18:49 crc kubenswrapper[4775]: I1216 15:18:49.017281 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 15:18:49 crc kubenswrapper[4775]: I1216 15:18:49.017667 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 15:18:50 crc kubenswrapper[4775]: I1216 15:18:50.032193 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="22a00983-b0df-4afb-bbc2-2f7da7c8c05e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.211:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 15:18:50 crc kubenswrapper[4775]: I1216 15:18:50.032235 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="22a00983-b0df-4afb-bbc2-2f7da7c8c05e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.211:8774/\": net/http: request canceled (Client.Timeout 
exceeded while awaiting headers)" Dec 16 15:18:52 crc kubenswrapper[4775]: I1216 15:18:52.212341 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s28hs" Dec 16 15:18:52 crc kubenswrapper[4775]: I1216 15:18:52.276226 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s28hs" Dec 16 15:18:52 crc kubenswrapper[4775]: I1216 15:18:52.803214 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s28hs"] Dec 16 15:18:52 crc kubenswrapper[4775]: I1216 15:18:52.804640 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 16 15:18:52 crc kubenswrapper[4775]: I1216 15:18:52.983138 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vj7n7"] Dec 16 15:18:52 crc kubenswrapper[4775]: I1216 15:18:52.985619 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vj7n7" podUID="f0e02474-416e-4434-b482-2df56ae4c6a7" containerName="registry-server" containerID="cri-o://feb35b471f5dc44215765b93c63ac6cea02aa217582fc92aee78cc5074d69757" gracePeriod=2 Dec 16 15:18:53 crc kubenswrapper[4775]: E1216 15:18:53.294844 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0e02474_416e_4434_b482_2df56ae4c6a7.slice/crio-conmon-feb35b471f5dc44215765b93c63ac6cea02aa217582fc92aee78cc5074d69757.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0e02474_416e_4434_b482_2df56ae4c6a7.slice/crio-feb35b471f5dc44215765b93c63ac6cea02aa217582fc92aee78cc5074d69757.scope\": RecentStats: unable to find data in memory cache]" Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 
15:18:53.494087 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vj7n7" Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.547790 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fckz\" (UniqueName: \"kubernetes.io/projected/f0e02474-416e-4434-b482-2df56ae4c6a7-kube-api-access-5fckz\") pod \"f0e02474-416e-4434-b482-2df56ae4c6a7\" (UID: \"f0e02474-416e-4434-b482-2df56ae4c6a7\") " Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.548015 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e02474-416e-4434-b482-2df56ae4c6a7-catalog-content\") pod \"f0e02474-416e-4434-b482-2df56ae4c6a7\" (UID: \"f0e02474-416e-4434-b482-2df56ae4c6a7\") " Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.548095 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e02474-416e-4434-b482-2df56ae4c6a7-utilities\") pod \"f0e02474-416e-4434-b482-2df56ae4c6a7\" (UID: \"f0e02474-416e-4434-b482-2df56ae4c6a7\") " Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.548950 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0e02474-416e-4434-b482-2df56ae4c6a7-utilities" (OuterVolumeSpecName: "utilities") pod "f0e02474-416e-4434-b482-2df56ae4c6a7" (UID: "f0e02474-416e-4434-b482-2df56ae4c6a7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.564693 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0e02474-416e-4434-b482-2df56ae4c6a7-kube-api-access-5fckz" (OuterVolumeSpecName: "kube-api-access-5fckz") pod "f0e02474-416e-4434-b482-2df56ae4c6a7" (UID: "f0e02474-416e-4434-b482-2df56ae4c6a7"). InnerVolumeSpecName "kube-api-access-5fckz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.580955 4775 generic.go:334] "Generic (PLEG): container finished" podID="f0e02474-416e-4434-b482-2df56ae4c6a7" containerID="feb35b471f5dc44215765b93c63ac6cea02aa217582fc92aee78cc5074d69757" exitCode=0 Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.581082 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vj7n7" Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.581071 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj7n7" event={"ID":"f0e02474-416e-4434-b482-2df56ae4c6a7","Type":"ContainerDied","Data":"feb35b471f5dc44215765b93c63ac6cea02aa217582fc92aee78cc5074d69757"} Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.581147 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj7n7" event={"ID":"f0e02474-416e-4434-b482-2df56ae4c6a7","Type":"ContainerDied","Data":"6c3da41871fb31447465411bc5aaa8f65ded6ff9ab5309b9ef5f56a1ffad3cfe"} Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.581172 4775 scope.go:117] "RemoveContainer" containerID="feb35b471f5dc44215765b93c63ac6cea02aa217582fc92aee78cc5074d69757" Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.643127 4775 scope.go:117] "RemoveContainer" containerID="58ce6bf10e5159cdb3c6497881ffecda7670412018bbc5f1082315bb5640868a" Dec 16 15:18:53 crc 
kubenswrapper[4775]: I1216 15:18:53.650383 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fckz\" (UniqueName: \"kubernetes.io/projected/f0e02474-416e-4434-b482-2df56ae4c6a7-kube-api-access-5fckz\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.650410 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e02474-416e-4434-b482-2df56ae4c6a7-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.672201 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0e02474-416e-4434-b482-2df56ae4c6a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0e02474-416e-4434-b482-2df56ae4c6a7" (UID: "f0e02474-416e-4434-b482-2df56ae4c6a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.689536 4775 scope.go:117] "RemoveContainer" containerID="0bf3341889a29cfdc13f39ef3ffe9dc41cd1758174c2c5ef23f67e256b4e7350" Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.712328 4775 scope.go:117] "RemoveContainer" containerID="feb35b471f5dc44215765b93c63ac6cea02aa217582fc92aee78cc5074d69757" Dec 16 15:18:53 crc kubenswrapper[4775]: E1216 15:18:53.712870 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feb35b471f5dc44215765b93c63ac6cea02aa217582fc92aee78cc5074d69757\": container with ID starting with feb35b471f5dc44215765b93c63ac6cea02aa217582fc92aee78cc5074d69757 not found: ID does not exist" containerID="feb35b471f5dc44215765b93c63ac6cea02aa217582fc92aee78cc5074d69757" Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.712927 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"feb35b471f5dc44215765b93c63ac6cea02aa217582fc92aee78cc5074d69757"} err="failed to get container status \"feb35b471f5dc44215765b93c63ac6cea02aa217582fc92aee78cc5074d69757\": rpc error: code = NotFound desc = could not find container \"feb35b471f5dc44215765b93c63ac6cea02aa217582fc92aee78cc5074d69757\": container with ID starting with feb35b471f5dc44215765b93c63ac6cea02aa217582fc92aee78cc5074d69757 not found: ID does not exist" Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.712959 4775 scope.go:117] "RemoveContainer" containerID="58ce6bf10e5159cdb3c6497881ffecda7670412018bbc5f1082315bb5640868a" Dec 16 15:18:53 crc kubenswrapper[4775]: E1216 15:18:53.713243 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58ce6bf10e5159cdb3c6497881ffecda7670412018bbc5f1082315bb5640868a\": container with ID starting with 58ce6bf10e5159cdb3c6497881ffecda7670412018bbc5f1082315bb5640868a not found: ID does not exist" containerID="58ce6bf10e5159cdb3c6497881ffecda7670412018bbc5f1082315bb5640868a" Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.713265 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ce6bf10e5159cdb3c6497881ffecda7670412018bbc5f1082315bb5640868a"} err="failed to get container status \"58ce6bf10e5159cdb3c6497881ffecda7670412018bbc5f1082315bb5640868a\": rpc error: code = NotFound desc = could not find container \"58ce6bf10e5159cdb3c6497881ffecda7670412018bbc5f1082315bb5640868a\": container with ID starting with 58ce6bf10e5159cdb3c6497881ffecda7670412018bbc5f1082315bb5640868a not found: ID does not exist" Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.713297 4775 scope.go:117] "RemoveContainer" containerID="0bf3341889a29cfdc13f39ef3ffe9dc41cd1758174c2c5ef23f67e256b4e7350" Dec 16 15:18:53 crc kubenswrapper[4775]: E1216 15:18:53.713548 4775 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0bf3341889a29cfdc13f39ef3ffe9dc41cd1758174c2c5ef23f67e256b4e7350\": container with ID starting with 0bf3341889a29cfdc13f39ef3ffe9dc41cd1758174c2c5ef23f67e256b4e7350 not found: ID does not exist" containerID="0bf3341889a29cfdc13f39ef3ffe9dc41cd1758174c2c5ef23f67e256b4e7350" Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.713567 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf3341889a29cfdc13f39ef3ffe9dc41cd1758174c2c5ef23f67e256b4e7350"} err="failed to get container status \"0bf3341889a29cfdc13f39ef3ffe9dc41cd1758174c2c5ef23f67e256b4e7350\": rpc error: code = NotFound desc = could not find container \"0bf3341889a29cfdc13f39ef3ffe9dc41cd1758174c2c5ef23f67e256b4e7350\": container with ID starting with 0bf3341889a29cfdc13f39ef3ffe9dc41cd1758174c2c5ef23f67e256b4e7350 not found: ID does not exist" Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.751743 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e02474-416e-4434-b482-2df56ae4c6a7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.921988 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vj7n7"] Dec 16 15:18:53 crc kubenswrapper[4775]: I1216 15:18:53.932813 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vj7n7"] Dec 16 15:18:55 crc kubenswrapper[4775]: I1216 15:18:55.348003 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0e02474-416e-4434-b482-2df56ae4c6a7" path="/var/lib/kubelet/pods/f0e02474-416e-4434-b482-2df56ae4c6a7/volumes" Dec 16 15:18:58 crc kubenswrapper[4775]: I1216 15:18:58.990858 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 15:18:58 crc 
kubenswrapper[4775]: I1216 15:18:58.991498 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 15:18:59 crc kubenswrapper[4775]: I1216 15:18:59.000708 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 15:18:59 crc kubenswrapper[4775]: I1216 15:18:59.005093 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 15:18:59 crc kubenswrapper[4775]: I1216 15:18:59.645158 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 15:18:59 crc kubenswrapper[4775]: I1216 15:18:59.646952 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 15:19:02 crc kubenswrapper[4775]: I1216 15:19:02.869369 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:19:02 crc kubenswrapper[4775]: I1216 15:19:02.869879 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:19:02 crc kubenswrapper[4775]: I1216 15:19:02.869956 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 15:19:02 crc kubenswrapper[4775]: I1216 15:19:02.870609 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"191baaa15580fc980936d4ebfb4d77ed829d99816880e694e3e02bd3ec00e6a9"} pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:19:02 crc kubenswrapper[4775]: I1216 15:19:02.870669 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" containerID="cri-o://191baaa15580fc980936d4ebfb4d77ed829d99816880e694e3e02bd3ec00e6a9" gracePeriod=600 Dec 16 15:19:03 crc kubenswrapper[4775]: I1216 15:19:03.677180 4775 generic.go:334] "Generic (PLEG): container finished" podID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerID="191baaa15580fc980936d4ebfb4d77ed829d99816880e694e3e02bd3ec00e6a9" exitCode=0 Dec 16 15:19:03 crc kubenswrapper[4775]: I1216 15:19:03.677244 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerDied","Data":"191baaa15580fc980936d4ebfb4d77ed829d99816880e694e3e02bd3ec00e6a9"} Dec 16 15:19:03 crc kubenswrapper[4775]: I1216 15:19:03.678026 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda"} Dec 16 15:19:03 crc kubenswrapper[4775]: I1216 15:19:03.678058 4775 scope.go:117] "RemoveContainer" containerID="1a2e1ee11401b69f007c34bf8d81d1b271d0b2639b666040ec08a76eb20c628c" Dec 16 15:19:07 crc kubenswrapper[4775]: I1216 15:19:07.735744 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:19:08 crc kubenswrapper[4775]: I1216 15:19:08.681625 4775 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:19:12 crc kubenswrapper[4775]: I1216 15:19:12.214244 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="79fbce0a-9f2b-4548-b886-de6dfe5ff245" containerName="rabbitmq" containerID="cri-o://a4159eb3eaf7d3182fc730fa78a7925b1f4d365d729ae0261021a1d8812d6dc5" gracePeriod=604796 Dec 16 15:19:12 crc kubenswrapper[4775]: I1216 15:19:12.954229 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="0451a266-fe64-4e36-93f7-9ebb1e547eec" containerName="rabbitmq" containerID="cri-o://bd14a367b3db152d794c63c2f4462c459b195bb462deb73243ee7ad2ca5594db" gracePeriod=604796 Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.813195 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.855585 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/79fbce0a-9f2b-4548-b886-de6dfe5ff245-erlang-cookie-secret\") pod \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.855663 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-tls\") pod \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.855704 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-plugins\") pod \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\" 
(UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.855787 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-plugins-conf\") pod \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.855856 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-erlang-cookie\") pod \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.855972 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-confd\") pod \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.855994 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/79fbce0a-9f2b-4548-b886-de6dfe5ff245-pod-info\") pod \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.856033 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.856089 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-config-data\") pod \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.856112 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gfvw\" (UniqueName: \"kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-kube-api-access-2gfvw\") pod \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.856151 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-server-conf\") pod \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\" (UID: \"79fbce0a-9f2b-4548-b886-de6dfe5ff245\") " Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.856788 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "79fbce0a-9f2b-4548-b886-de6dfe5ff245" (UID: "79fbce0a-9f2b-4548-b886-de6dfe5ff245"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.857195 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "79fbce0a-9f2b-4548-b886-de6dfe5ff245" (UID: "79fbce0a-9f2b-4548-b886-de6dfe5ff245"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.857235 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "79fbce0a-9f2b-4548-b886-de6dfe5ff245" (UID: "79fbce0a-9f2b-4548-b886-de6dfe5ff245"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.863442 4775 generic.go:334] "Generic (PLEG): container finished" podID="79fbce0a-9f2b-4548-b886-de6dfe5ff245" containerID="a4159eb3eaf7d3182fc730fa78a7925b1f4d365d729ae0261021a1d8812d6dc5" exitCode=0 Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.863497 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"79fbce0a-9f2b-4548-b886-de6dfe5ff245","Type":"ContainerDied","Data":"a4159eb3eaf7d3182fc730fa78a7925b1f4d365d729ae0261021a1d8812d6dc5"} Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.863529 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"79fbce0a-9f2b-4548-b886-de6dfe5ff245","Type":"ContainerDied","Data":"5d33d206f49d875ad84b445bececae3b545b62ccef35a30702aefab81b775935"} Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.863548 4775 scope.go:117] "RemoveContainer" containerID="a4159eb3eaf7d3182fc730fa78a7925b1f4d365d729ae0261021a1d8812d6dc5" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.865031 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.865385 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/79fbce0a-9f2b-4548-b886-de6dfe5ff245-pod-info" (OuterVolumeSpecName: "pod-info") pod "79fbce0a-9f2b-4548-b886-de6dfe5ff245" (UID: "79fbce0a-9f2b-4548-b886-de6dfe5ff245"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.865464 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "79fbce0a-9f2b-4548-b886-de6dfe5ff245" (UID: "79fbce0a-9f2b-4548-b886-de6dfe5ff245"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.866535 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-kube-api-access-2gfvw" (OuterVolumeSpecName: "kube-api-access-2gfvw") pod "79fbce0a-9f2b-4548-b886-de6dfe5ff245" (UID: "79fbce0a-9f2b-4548-b886-de6dfe5ff245"). InnerVolumeSpecName "kube-api-access-2gfvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.867989 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "79fbce0a-9f2b-4548-b886-de6dfe5ff245" (UID: "79fbce0a-9f2b-4548-b886-de6dfe5ff245"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.872090 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79fbce0a-9f2b-4548-b886-de6dfe5ff245-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "79fbce0a-9f2b-4548-b886-de6dfe5ff245" (UID: "79fbce0a-9f2b-4548-b886-de6dfe5ff245"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.897076 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-config-data" (OuterVolumeSpecName: "config-data") pod "79fbce0a-9f2b-4548-b886-de6dfe5ff245" (UID: "79fbce0a-9f2b-4548-b886-de6dfe5ff245"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.958087 4775 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/79fbce0a-9f2b-4548-b886-de6dfe5ff245-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.958123 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.958139 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.958150 4775 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-plugins-conf\") on node 
\"crc\" DevicePath \"\"" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.958163 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.958173 4775 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/79fbce0a-9f2b-4548-b886-de6dfe5ff245-pod-info\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.958210 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.958223 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.958236 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gfvw\" (UniqueName: \"kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-kube-api-access-2gfvw\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:18 crc kubenswrapper[4775]: I1216 15:19:18.965053 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-server-conf" (OuterVolumeSpecName: "server-conf") pod "79fbce0a-9f2b-4548-b886-de6dfe5ff245" (UID: "79fbce0a-9f2b-4548-b886-de6dfe5ff245"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.002701 4775 scope.go:117] "RemoveContainer" containerID="790a4a60bbabbe93361bb85ff6f9a1546bd650f527fc5aeb455b91ee31cccce3" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.010690 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.056950 4775 scope.go:117] "RemoveContainer" containerID="a4159eb3eaf7d3182fc730fa78a7925b1f4d365d729ae0261021a1d8812d6dc5" Dec 16 15:19:19 crc kubenswrapper[4775]: E1216 15:19:19.057331 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4159eb3eaf7d3182fc730fa78a7925b1f4d365d729ae0261021a1d8812d6dc5\": container with ID starting with a4159eb3eaf7d3182fc730fa78a7925b1f4d365d729ae0261021a1d8812d6dc5 not found: ID does not exist" containerID="a4159eb3eaf7d3182fc730fa78a7925b1f4d365d729ae0261021a1d8812d6dc5" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.057363 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4159eb3eaf7d3182fc730fa78a7925b1f4d365d729ae0261021a1d8812d6dc5"} err="failed to get container status \"a4159eb3eaf7d3182fc730fa78a7925b1f4d365d729ae0261021a1d8812d6dc5\": rpc error: code = NotFound desc = could not find container \"a4159eb3eaf7d3182fc730fa78a7925b1f4d365d729ae0261021a1d8812d6dc5\": container with ID starting with a4159eb3eaf7d3182fc730fa78a7925b1f4d365d729ae0261021a1d8812d6dc5 not found: ID does not exist" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.057383 4775 scope.go:117] "RemoveContainer" containerID="790a4a60bbabbe93361bb85ff6f9a1546bd650f527fc5aeb455b91ee31cccce3" Dec 16 15:19:19 crc kubenswrapper[4775]: E1216 15:19:19.057593 4775 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"790a4a60bbabbe93361bb85ff6f9a1546bd650f527fc5aeb455b91ee31cccce3\": container with ID starting with 790a4a60bbabbe93361bb85ff6f9a1546bd650f527fc5aeb455b91ee31cccce3 not found: ID does not exist" containerID="790a4a60bbabbe93361bb85ff6f9a1546bd650f527fc5aeb455b91ee31cccce3" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.057612 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"790a4a60bbabbe93361bb85ff6f9a1546bd650f527fc5aeb455b91ee31cccce3"} err="failed to get container status \"790a4a60bbabbe93361bb85ff6f9a1546bd650f527fc5aeb455b91ee31cccce3\": rpc error: code = NotFound desc = could not find container \"790a4a60bbabbe93361bb85ff6f9a1546bd650f527fc5aeb455b91ee31cccce3\": container with ID starting with 790a4a60bbabbe93361bb85ff6f9a1546bd650f527fc5aeb455b91ee31cccce3 not found: ID does not exist" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.063255 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.063291 4775 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/79fbce0a-9f2b-4548-b886-de6dfe5ff245-server-conf\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.072132 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "79fbce0a-9f2b-4548-b886-de6dfe5ff245" (UID: "79fbce0a-9f2b-4548-b886-de6dfe5ff245"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.165862 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/79fbce0a-9f2b-4548-b886-de6dfe5ff245-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.233172 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.255300 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.263682 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:19:19 crc kubenswrapper[4775]: E1216 15:19:19.264158 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fbce0a-9f2b-4548-b886-de6dfe5ff245" containerName="setup-container" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.264172 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fbce0a-9f2b-4548-b886-de6dfe5ff245" containerName="setup-container" Dec 16 15:19:19 crc kubenswrapper[4775]: E1216 15:19:19.264198 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e02474-416e-4434-b482-2df56ae4c6a7" containerName="extract-content" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.264203 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e02474-416e-4434-b482-2df56ae4c6a7" containerName="extract-content" Dec 16 15:19:19 crc kubenswrapper[4775]: E1216 15:19:19.264228 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fbce0a-9f2b-4548-b886-de6dfe5ff245" containerName="rabbitmq" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.264235 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fbce0a-9f2b-4548-b886-de6dfe5ff245" containerName="rabbitmq" Dec 16 15:19:19 crc kubenswrapper[4775]: 
E1216 15:19:19.264249 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e02474-416e-4434-b482-2df56ae4c6a7" containerName="extract-utilities" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.264255 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e02474-416e-4434-b482-2df56ae4c6a7" containerName="extract-utilities" Dec 16 15:19:19 crc kubenswrapper[4775]: E1216 15:19:19.264271 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e02474-416e-4434-b482-2df56ae4c6a7" containerName="registry-server" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.264276 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e02474-416e-4434-b482-2df56ae4c6a7" containerName="registry-server" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.264477 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="79fbce0a-9f2b-4548-b886-de6dfe5ff245" containerName="rabbitmq" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.264497 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0e02474-416e-4434-b482-2df56ae4c6a7" containerName="registry-server" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.265507 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.271645 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.282879 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.282931 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.283111 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.283693 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.283803 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.285388 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.286878 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-244wz" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.348667 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79fbce0a-9f2b-4548-b886-de6dfe5ff245" path="/var/lib/kubelet/pods/79fbce0a-9f2b-4548-b886-de6dfe5ff245/volumes" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.369128 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ba99f865-7192-4da9-8575-62d54a66d82e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " 
pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.369195 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ba99f865-7192-4da9-8575-62d54a66d82e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.369254 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ba99f865-7192-4da9-8575-62d54a66d82e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.369284 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ba99f865-7192-4da9-8575-62d54a66d82e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.369352 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ba99f865-7192-4da9-8575-62d54a66d82e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.369380 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.369412 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ba99f865-7192-4da9-8575-62d54a66d82e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.369448 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ba99f865-7192-4da9-8575-62d54a66d82e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.369490 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ba99f865-7192-4da9-8575-62d54a66d82e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.369555 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2frg4\" (UniqueName: \"kubernetes.io/projected/ba99f865-7192-4da9-8575-62d54a66d82e-kube-api-access-2frg4\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.369586 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba99f865-7192-4da9-8575-62d54a66d82e-config-data\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.478518 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2frg4\" (UniqueName: \"kubernetes.io/projected/ba99f865-7192-4da9-8575-62d54a66d82e-kube-api-access-2frg4\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.478604 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba99f865-7192-4da9-8575-62d54a66d82e-config-data\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.478680 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ba99f865-7192-4da9-8575-62d54a66d82e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.478732 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ba99f865-7192-4da9-8575-62d54a66d82e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.478789 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ba99f865-7192-4da9-8575-62d54a66d82e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.478824 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ba99f865-7192-4da9-8575-62d54a66d82e-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.478941 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ba99f865-7192-4da9-8575-62d54a66d82e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.478988 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.479035 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ba99f865-7192-4da9-8575-62d54a66d82e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.479077 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ba99f865-7192-4da9-8575-62d54a66d82e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.479134 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ba99f865-7192-4da9-8575-62d54a66d82e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 
15:19:19.479897 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba99f865-7192-4da9-8575-62d54a66d82e-config-data\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.481326 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ba99f865-7192-4da9-8575-62d54a66d82e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.481783 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ba99f865-7192-4da9-8575-62d54a66d82e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.483064 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ba99f865-7192-4da9-8575-62d54a66d82e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.483362 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.487237 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/ba99f865-7192-4da9-8575-62d54a66d82e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.490309 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ba99f865-7192-4da9-8575-62d54a66d82e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.504870 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ba99f865-7192-4da9-8575-62d54a66d82e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.506201 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ba99f865-7192-4da9-8575-62d54a66d82e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.510253 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ba99f865-7192-4da9-8575-62d54a66d82e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.512626 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2frg4\" (UniqueName: \"kubernetes.io/projected/ba99f865-7192-4da9-8575-62d54a66d82e-kube-api-access-2frg4\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " 
pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.537303 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"ba99f865-7192-4da9-8575-62d54a66d82e\") " pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.599277 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.675869 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.783270 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-erlang-cookie\") pod \"0451a266-fe64-4e36-93f7-9ebb1e547eec\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.783375 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0451a266-fe64-4e36-93f7-9ebb1e547eec-erlang-cookie-secret\") pod \"0451a266-fe64-4e36-93f7-9ebb1e547eec\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.783422 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-server-conf\") pod \"0451a266-fe64-4e36-93f7-9ebb1e547eec\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.783492 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-tls\") pod \"0451a266-fe64-4e36-93f7-9ebb1e547eec\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.783524 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6ksm\" (UniqueName: \"kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-kube-api-access-f6ksm\") pod \"0451a266-fe64-4e36-93f7-9ebb1e547eec\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.783569 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-plugins-conf\") pod \"0451a266-fe64-4e36-93f7-9ebb1e547eec\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.783639 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0451a266-fe64-4e36-93f7-9ebb1e547eec-pod-info\") pod \"0451a266-fe64-4e36-93f7-9ebb1e547eec\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.783681 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-confd\") pod \"0451a266-fe64-4e36-93f7-9ebb1e547eec\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.783704 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-plugins\") pod \"0451a266-fe64-4e36-93f7-9ebb1e547eec\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 
15:19:19.783724 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"0451a266-fe64-4e36-93f7-9ebb1e547eec\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.783763 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-config-data\") pod \"0451a266-fe64-4e36-93f7-9ebb1e547eec\" (UID: \"0451a266-fe64-4e36-93f7-9ebb1e547eec\") " Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.786706 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0451a266-fe64-4e36-93f7-9ebb1e547eec" (UID: "0451a266-fe64-4e36-93f7-9ebb1e547eec"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.787018 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0451a266-fe64-4e36-93f7-9ebb1e547eec" (UID: "0451a266-fe64-4e36-93f7-9ebb1e547eec"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.788310 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0451a266-fe64-4e36-93f7-9ebb1e547eec" (UID: "0451a266-fe64-4e36-93f7-9ebb1e547eec"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.789173 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-kube-api-access-f6ksm" (OuterVolumeSpecName: "kube-api-access-f6ksm") pod "0451a266-fe64-4e36-93f7-9ebb1e547eec" (UID: "0451a266-fe64-4e36-93f7-9ebb1e547eec"). InnerVolumeSpecName "kube-api-access-f6ksm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.790919 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0451a266-fe64-4e36-93f7-9ebb1e547eec-pod-info" (OuterVolumeSpecName: "pod-info") pod "0451a266-fe64-4e36-93f7-9ebb1e547eec" (UID: "0451a266-fe64-4e36-93f7-9ebb1e547eec"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.791398 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0451a266-fe64-4e36-93f7-9ebb1e547eec-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0451a266-fe64-4e36-93f7-9ebb1e547eec" (UID: "0451a266-fe64-4e36-93f7-9ebb1e547eec"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.799377 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "0451a266-fe64-4e36-93f7-9ebb1e547eec" (UID: "0451a266-fe64-4e36-93f7-9ebb1e547eec"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.799500 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0451a266-fe64-4e36-93f7-9ebb1e547eec" (UID: "0451a266-fe64-4e36-93f7-9ebb1e547eec"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.835137 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-config-data" (OuterVolumeSpecName: "config-data") pod "0451a266-fe64-4e36-93f7-9ebb1e547eec" (UID: "0451a266-fe64-4e36-93f7-9ebb1e547eec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.882658 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-server-conf" (OuterVolumeSpecName: "server-conf") pod "0451a266-fe64-4e36-93f7-9ebb1e547eec" (UID: "0451a266-fe64-4e36-93f7-9ebb1e547eec"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.885828 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.885873 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.889370 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.889407 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.889421 4775 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0451a266-fe64-4e36-93f7-9ebb1e547eec-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.889434 4775 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-server-conf\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.889444 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.889456 4775 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6ksm\" (UniqueName: \"kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-kube-api-access-f6ksm\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.889468 4775 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0451a266-fe64-4e36-93f7-9ebb1e547eec-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.889478 4775 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0451a266-fe64-4e36-93f7-9ebb1e547eec-pod-info\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.889701 4775 generic.go:334] "Generic (PLEG): container finished" podID="0451a266-fe64-4e36-93f7-9ebb1e547eec" containerID="bd14a367b3db152d794c63c2f4462c459b195bb462deb73243ee7ad2ca5594db" exitCode=0 Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.889783 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0451a266-fe64-4e36-93f7-9ebb1e547eec","Type":"ContainerDied","Data":"bd14a367b3db152d794c63c2f4462c459b195bb462deb73243ee7ad2ca5594db"} Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.889815 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0451a266-fe64-4e36-93f7-9ebb1e547eec","Type":"ContainerDied","Data":"5f790c6338d10d60a544ec29df3dc224bbe7565c449092d0782500a3b7b0a9a1"} Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.889834 4775 scope.go:117] "RemoveContainer" containerID="bd14a367b3db152d794c63c2f4462c459b195bb462deb73243ee7ad2ca5594db" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.890042 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.912832 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.920582 4775 scope.go:117] "RemoveContainer" containerID="290cef72b957bb2ee1c39d2653f4f5bb1e67aa6a9764573ac57356610f089b95" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.940523 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0451a266-fe64-4e36-93f7-9ebb1e547eec" (UID: "0451a266-fe64-4e36-93f7-9ebb1e547eec"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.951938 4775 scope.go:117] "RemoveContainer" containerID="bd14a367b3db152d794c63c2f4462c459b195bb462deb73243ee7ad2ca5594db" Dec 16 15:19:19 crc kubenswrapper[4775]: E1216 15:19:19.952562 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd14a367b3db152d794c63c2f4462c459b195bb462deb73243ee7ad2ca5594db\": container with ID starting with bd14a367b3db152d794c63c2f4462c459b195bb462deb73243ee7ad2ca5594db not found: ID does not exist" containerID="bd14a367b3db152d794c63c2f4462c459b195bb462deb73243ee7ad2ca5594db" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.952607 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd14a367b3db152d794c63c2f4462c459b195bb462deb73243ee7ad2ca5594db"} err="failed to get container status \"bd14a367b3db152d794c63c2f4462c459b195bb462deb73243ee7ad2ca5594db\": rpc error: code = NotFound desc = could not find container 
\"bd14a367b3db152d794c63c2f4462c459b195bb462deb73243ee7ad2ca5594db\": container with ID starting with bd14a367b3db152d794c63c2f4462c459b195bb462deb73243ee7ad2ca5594db not found: ID does not exist" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.952639 4775 scope.go:117] "RemoveContainer" containerID="290cef72b957bb2ee1c39d2653f4f5bb1e67aa6a9764573ac57356610f089b95" Dec 16 15:19:19 crc kubenswrapper[4775]: E1216 15:19:19.952927 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"290cef72b957bb2ee1c39d2653f4f5bb1e67aa6a9764573ac57356610f089b95\": container with ID starting with 290cef72b957bb2ee1c39d2653f4f5bb1e67aa6a9764573ac57356610f089b95 not found: ID does not exist" containerID="290cef72b957bb2ee1c39d2653f4f5bb1e67aa6a9764573ac57356610f089b95" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.952947 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"290cef72b957bb2ee1c39d2653f4f5bb1e67aa6a9764573ac57356610f089b95"} err="failed to get container status \"290cef72b957bb2ee1c39d2653f4f5bb1e67aa6a9764573ac57356610f089b95\": rpc error: code = NotFound desc = could not find container \"290cef72b957bb2ee1c39d2653f4f5bb1e67aa6a9764573ac57356610f089b95\": container with ID starting with 290cef72b957bb2ee1c39d2653f4f5bb1e67aa6a9764573ac57356610f089b95 not found: ID does not exist" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.992036 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0451a266-fe64-4e36-93f7-9ebb1e547eec-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:19 crc kubenswrapper[4775]: I1216 15:19:19.992066 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 
15:19:20.120970 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:19:20 crc kubenswrapper[4775]: W1216 15:19:20.122751 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba99f865_7192_4da9_8575_62d54a66d82e.slice/crio-15ef1721890465a3febedd607f1fc3ec4a7eb69b7031a016067a623f9f692e74 WatchSource:0}: Error finding container 15ef1721890465a3febedd607f1fc3ec4a7eb69b7031a016067a623f9f692e74: Status 404 returned error can't find the container with id 15ef1721890465a3febedd607f1fc3ec4a7eb69b7031a016067a623f9f692e74 Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.242982 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.253677 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.269560 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:19:20 crc kubenswrapper[4775]: E1216 15:19:20.270476 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0451a266-fe64-4e36-93f7-9ebb1e547eec" containerName="rabbitmq" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.270501 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0451a266-fe64-4e36-93f7-9ebb1e547eec" containerName="rabbitmq" Dec 16 15:19:20 crc kubenswrapper[4775]: E1216 15:19:20.270537 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0451a266-fe64-4e36-93f7-9ebb1e547eec" containerName="setup-container" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.270545 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0451a266-fe64-4e36-93f7-9ebb1e547eec" containerName="setup-container" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.270809 4775 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0451a266-fe64-4e36-93f7-9ebb1e547eec" containerName="rabbitmq" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.271988 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.274344 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.274464 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.274569 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.274631 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.274654 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.274697 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.276606 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-qtj5d" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.289424 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.297195 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cf004dca-5d2e-4e4d-9c29-66b076fcc406-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.297244 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfw9d\" (UniqueName: \"kubernetes.io/projected/cf004dca-5d2e-4e4d-9c29-66b076fcc406-kube-api-access-rfw9d\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.297271 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cf004dca-5d2e-4e4d-9c29-66b076fcc406-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.297432 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cf004dca-5d2e-4e4d-9c29-66b076fcc406-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.297629 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cf004dca-5d2e-4e4d-9c29-66b076fcc406-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.297669 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cf004dca-5d2e-4e4d-9c29-66b076fcc406-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.297752 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.297934 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf004dca-5d2e-4e4d-9c29-66b076fcc406-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.298029 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cf004dca-5d2e-4e4d-9c29-66b076fcc406-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.298113 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cf004dca-5d2e-4e4d-9c29-66b076fcc406-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.298149 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cf004dca-5d2e-4e4d-9c29-66b076fcc406-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.399445 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cf004dca-5d2e-4e4d-9c29-66b076fcc406-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.399491 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cf004dca-5d2e-4e4d-9c29-66b076fcc406-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.399536 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.399588 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf004dca-5d2e-4e4d-9c29-66b076fcc406-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.399607 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cf004dca-5d2e-4e4d-9c29-66b076fcc406-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.399668 
4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cf004dca-5d2e-4e4d-9c29-66b076fcc406-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.399698 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cf004dca-5d2e-4e4d-9c29-66b076fcc406-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.399747 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cf004dca-5d2e-4e4d-9c29-66b076fcc406-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.399789 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfw9d\" (UniqueName: \"kubernetes.io/projected/cf004dca-5d2e-4e4d-9c29-66b076fcc406-kube-api-access-rfw9d\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.399826 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cf004dca-5d2e-4e4d-9c29-66b076fcc406-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.399933 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cf004dca-5d2e-4e4d-9c29-66b076fcc406-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.401032 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.401176 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cf004dca-5d2e-4e4d-9c29-66b076fcc406-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.402162 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cf004dca-5d2e-4e4d-9c29-66b076fcc406-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.403182 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf004dca-5d2e-4e4d-9c29-66b076fcc406-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.403660 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/cf004dca-5d2e-4e4d-9c29-66b076fcc406-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.403952 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cf004dca-5d2e-4e4d-9c29-66b076fcc406-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.404180 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cf004dca-5d2e-4e4d-9c29-66b076fcc406-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.405761 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cf004dca-5d2e-4e4d-9c29-66b076fcc406-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.406308 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cf004dca-5d2e-4e4d-9c29-66b076fcc406-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.407083 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cf004dca-5d2e-4e4d-9c29-66b076fcc406-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.420518 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfw9d\" (UniqueName: \"kubernetes.io/projected/cf004dca-5d2e-4e4d-9c29-66b076fcc406-kube-api-access-rfw9d\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.441716 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cf004dca-5d2e-4e4d-9c29-66b076fcc406\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.596932 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:20 crc kubenswrapper[4775]: I1216 15:19:20.906852 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ba99f865-7192-4da9-8575-62d54a66d82e","Type":"ContainerStarted","Data":"15ef1721890465a3febedd607f1fc3ec4a7eb69b7031a016067a623f9f692e74"} Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.074350 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.377291 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0451a266-fe64-4e36-93f7-9ebb1e547eec" path="/var/lib/kubelet/pods/0451a266-fe64-4e36-93f7-9ebb1e547eec/volumes" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.475996 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-hxvjz"] Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.477638 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.480875 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.487930 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-hxvjz"] Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.518213 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.518316 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.518342 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.518437 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: 
\"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.518458 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55w2x\" (UniqueName: \"kubernetes.io/projected/4cb38560-d007-4226-a858-91afac6c6592-kube-api-access-55w2x\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.518489 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.518537 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-config\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.620380 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.620548 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: 
\"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.620620 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.620743 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.620769 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55w2x\" (UniqueName: \"kubernetes.io/projected/4cb38560-d007-4226-a858-91afac6c6592-kube-api-access-55w2x\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.620813 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.620868 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-config\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " 
pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.621493 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.621578 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.621697 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.621772 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.622040 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: 
I1216 15:19:21.622237 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-config\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.710164 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55w2x\" (UniqueName: \"kubernetes.io/projected/4cb38560-d007-4226-a858-91afac6c6592-kube-api-access-55w2x\") pod \"dnsmasq-dns-7d84b4d45c-hxvjz\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.824655 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:21 crc kubenswrapper[4775]: I1216 15:19:21.943036 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cf004dca-5d2e-4e4d-9c29-66b076fcc406","Type":"ContainerStarted","Data":"262071fd23d5090a84c736aaf2713fae1c7a67d337629feb171ccb2e714067fa"} Dec 16 15:19:22 crc kubenswrapper[4775]: I1216 15:19:22.292971 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-hxvjz"] Dec 16 15:19:22 crc kubenswrapper[4775]: W1216 15:19:22.301823 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cb38560_d007_4226_a858_91afac6c6592.slice/crio-10611fb0eb394ed7cf3a982faa4138b0b3c78c43fd7ca4b85328492338e79185 WatchSource:0}: Error finding container 10611fb0eb394ed7cf3a982faa4138b0b3c78c43fd7ca4b85328492338e79185: Status 404 returned error can't find the container with id 10611fb0eb394ed7cf3a982faa4138b0b3c78c43fd7ca4b85328492338e79185 Dec 16 15:19:22 crc kubenswrapper[4775]: I1216 15:19:22.951494 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cf004dca-5d2e-4e4d-9c29-66b076fcc406","Type":"ContainerStarted","Data":"d6d722a6e1e586694a11e93f179519ae232185bc1f7fdb8e41f459b595f1c3e6"} Dec 16 15:19:22 crc kubenswrapper[4775]: I1216 15:19:22.953073 4775 generic.go:334] "Generic (PLEG): container finished" podID="4cb38560-d007-4226-a858-91afac6c6592" containerID="680e3d1ee2b7f11f6d0020309ab19b0f9853c289aed0b9b368cfeb2fa53f7793" exitCode=0 Dec 16 15:19:22 crc kubenswrapper[4775]: I1216 15:19:22.953196 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" event={"ID":"4cb38560-d007-4226-a858-91afac6c6592","Type":"ContainerDied","Data":"680e3d1ee2b7f11f6d0020309ab19b0f9853c289aed0b9b368cfeb2fa53f7793"} Dec 16 15:19:22 crc kubenswrapper[4775]: I1216 15:19:22.953231 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" event={"ID":"4cb38560-d007-4226-a858-91afac6c6592","Type":"ContainerStarted","Data":"10611fb0eb394ed7cf3a982faa4138b0b3c78c43fd7ca4b85328492338e79185"} Dec 16 15:19:22 crc kubenswrapper[4775]: I1216 15:19:22.955862 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ba99f865-7192-4da9-8575-62d54a66d82e","Type":"ContainerStarted","Data":"e4c359f73c9a3f9316aed03f86264ee088f62481382f44173e08e4926da63729"} Dec 16 15:19:23 crc kubenswrapper[4775]: I1216 15:19:23.977023 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" event={"ID":"4cb38560-d007-4226-a858-91afac6c6592","Type":"ContainerStarted","Data":"f736d6922af96d6c048e207090b5212bc15e4d15599346e311fa477d1c5762b4"} Dec 16 15:19:23 crc kubenswrapper[4775]: I1216 15:19:23.977850 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:24 crc kubenswrapper[4775]: I1216 15:19:24.011548 4775 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" podStartSLOduration=3.011524122 podStartE2EDuration="3.011524122s" podCreationTimestamp="2025-12-16 15:19:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:19:23.995791806 +0000 UTC m=+1488.946870759" watchObservedRunningTime="2025-12-16 15:19:24.011524122 +0000 UTC m=+1488.962603065" Dec 16 15:19:31 crc kubenswrapper[4775]: I1216 15:19:31.827212 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:31 crc kubenswrapper[4775]: I1216 15:19:31.900753 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-69b7h"] Dec 16 15:19:31 crc kubenswrapper[4775]: I1216 15:19:31.901292 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" podUID="1b8b1fb4-1c9e-4920-aa5e-767437ef455f" containerName="dnsmasq-dns" containerID="cri-o://fb5ee18e6c468bbafb0421247f690fe08a9100a09726d0f904a05a89279b3712" gracePeriod=10 Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.062259 4775 generic.go:334] "Generic (PLEG): container finished" podID="1b8b1fb4-1c9e-4920-aa5e-767437ef455f" containerID="fb5ee18e6c468bbafb0421247f690fe08a9100a09726d0f904a05a89279b3712" exitCode=0 Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.062308 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" event={"ID":"1b8b1fb4-1c9e-4920-aa5e-767437ef455f","Type":"ContainerDied","Data":"fb5ee18e6c468bbafb0421247f690fe08a9100a09726d0f904a05a89279b3712"} Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.091586 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-f8mml"] Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.093254 
4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.106749 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-f8mml"] Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.249930 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.250553 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7z6m\" (UniqueName: \"kubernetes.io/projected/cb5b019a-c088-4515-91e1-a110d1ee04c9-kube-api-access-s7z6m\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.250794 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.250830 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.251412 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.251662 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.251698 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-config\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.352960 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.353022 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: 
I1216 15:19:32.353048 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-config\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.353091 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.353108 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7z6m\" (UniqueName: \"kubernetes.io/projected/cb5b019a-c088-4515-91e1-a110d1ee04c9-kube-api-access-s7z6m\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.353146 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.353171 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.354235 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-config\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.354248 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.354856 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.355081 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.355669 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.358248 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/cb5b019a-c088-4515-91e1-a110d1ee04c9-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.379120 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7z6m\" (UniqueName: \"kubernetes.io/projected/cb5b019a-c088-4515-91e1-a110d1ee04c9-kube-api-access-s7z6m\") pod \"dnsmasq-dns-6f6df4f56c-f8mml\" (UID: \"cb5b019a-c088-4515-91e1-a110d1ee04c9\") " pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.445639 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.576234 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.661496 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-dns-svc\") pod \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.661599 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-ovsdbserver-nb\") pod \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.661852 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-config\") pod \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\" (UID: 
\"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.661996 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-dns-swift-storage-0\") pod \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.662122 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6sst\" (UniqueName: \"kubernetes.io/projected/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-kube-api-access-p6sst\") pod \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.662175 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-ovsdbserver-sb\") pod \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\" (UID: \"1b8b1fb4-1c9e-4920-aa5e-767437ef455f\") " Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.672811 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-kube-api-access-p6sst" (OuterVolumeSpecName: "kube-api-access-p6sst") pod "1b8b1fb4-1c9e-4920-aa5e-767437ef455f" (UID: "1b8b1fb4-1c9e-4920-aa5e-767437ef455f"). InnerVolumeSpecName "kube-api-access-p6sst". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.719719 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1b8b1fb4-1c9e-4920-aa5e-767437ef455f" (UID: "1b8b1fb4-1c9e-4920-aa5e-767437ef455f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.726584 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1b8b1fb4-1c9e-4920-aa5e-767437ef455f" (UID: "1b8b1fb4-1c9e-4920-aa5e-767437ef455f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.726620 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1b8b1fb4-1c9e-4920-aa5e-767437ef455f" (UID: "1b8b1fb4-1c9e-4920-aa5e-767437ef455f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.728361 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1b8b1fb4-1c9e-4920-aa5e-767437ef455f" (UID: "1b8b1fb4-1c9e-4920-aa5e-767437ef455f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.730674 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-config" (OuterVolumeSpecName: "config") pod "1b8b1fb4-1c9e-4920-aa5e-767437ef455f" (UID: "1b8b1fb4-1c9e-4920-aa5e-767437ef455f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.764923 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.764952 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.764965 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.764974 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.764985 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6sst\" (UniqueName: \"kubernetes.io/projected/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-kube-api-access-p6sst\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.764994 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b8b1fb4-1c9e-4920-aa5e-767437ef455f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:32 crc kubenswrapper[4775]: I1216 15:19:32.942381 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-f8mml"] Dec 16 15:19:33 crc kubenswrapper[4775]: I1216 15:19:33.070366 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" 
event={"ID":"cb5b019a-c088-4515-91e1-a110d1ee04c9","Type":"ContainerStarted","Data":"75b19ed3479cff29a5c7957b13353e2dfa3ea90a9a2d5cb690ec95b76638b059"} Dec 16 15:19:33 crc kubenswrapper[4775]: I1216 15:19:33.072160 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" event={"ID":"1b8b1fb4-1c9e-4920-aa5e-767437ef455f","Type":"ContainerDied","Data":"b53fbe9d9ac25c1a1ffcdf4da293e5ee9fc6659459513827252c2dd877a68706"} Dec 16 15:19:33 crc kubenswrapper[4775]: I1216 15:19:33.072212 4775 scope.go:117] "RemoveContainer" containerID="fb5ee18e6c468bbafb0421247f690fe08a9100a09726d0f904a05a89279b3712" Dec 16 15:19:33 crc kubenswrapper[4775]: I1216 15:19:33.072478 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-69b7h" Dec 16 15:19:33 crc kubenswrapper[4775]: I1216 15:19:33.133014 4775 scope.go:117] "RemoveContainer" containerID="2e199d5870654836c2e3c777abea7c81f9f4e1a00a91978fdf2a0c4dbcaf1740" Dec 16 15:19:33 crc kubenswrapper[4775]: I1216 15:19:33.190719 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-69b7h"] Dec 16 15:19:33 crc kubenswrapper[4775]: I1216 15:19:33.206925 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-69b7h"] Dec 16 15:19:33 crc kubenswrapper[4775]: I1216 15:19:33.351563 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b8b1fb4-1c9e-4920-aa5e-767437ef455f" path="/var/lib/kubelet/pods/1b8b1fb4-1c9e-4920-aa5e-767437ef455f/volumes" Dec 16 15:19:34 crc kubenswrapper[4775]: I1216 15:19:34.082023 4775 generic.go:334] "Generic (PLEG): container finished" podID="cb5b019a-c088-4515-91e1-a110d1ee04c9" containerID="313f9ad20ef4698a807c6db5dd8cea91ad7dd1762f3b968a83fc60fd0e84f938" exitCode=0 Dec 16 15:19:34 crc kubenswrapper[4775]: I1216 15:19:34.082129 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" event={"ID":"cb5b019a-c088-4515-91e1-a110d1ee04c9","Type":"ContainerDied","Data":"313f9ad20ef4698a807c6db5dd8cea91ad7dd1762f3b968a83fc60fd0e84f938"} Dec 16 15:19:35 crc kubenswrapper[4775]: I1216 15:19:35.096337 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" event={"ID":"cb5b019a-c088-4515-91e1-a110d1ee04c9","Type":"ContainerStarted","Data":"8243bf7c7b9b3aa571fa534af210d4e407741555b09fd06a2049d7d1a8cd0c14"} Dec 16 15:19:35 crc kubenswrapper[4775]: I1216 15:19:35.096922 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:35 crc kubenswrapper[4775]: I1216 15:19:35.119174 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" podStartSLOduration=3.119157742 podStartE2EDuration="3.119157742s" podCreationTimestamp="2025-12-16 15:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:19:35.11691306 +0000 UTC m=+1500.067991993" watchObservedRunningTime="2025-12-16 15:19:35.119157742 +0000 UTC m=+1500.070236665" Dec 16 15:19:42 crc kubenswrapper[4775]: I1216 15:19:42.450186 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6df4f56c-f8mml" Dec 16 15:19:42 crc kubenswrapper[4775]: I1216 15:19:42.516402 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-hxvjz"] Dec 16 15:19:42 crc kubenswrapper[4775]: I1216 15:19:42.516624 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" podUID="4cb38560-d007-4226-a858-91afac6c6592" containerName="dnsmasq-dns" containerID="cri-o://f736d6922af96d6c048e207090b5212bc15e4d15599346e311fa477d1c5762b4" gracePeriod=10 Dec 16 15:19:43 crc 
kubenswrapper[4775]: I1216 15:19:43.176135 4775 generic.go:334] "Generic (PLEG): container finished" podID="4cb38560-d007-4226-a858-91afac6c6592" containerID="f736d6922af96d6c048e207090b5212bc15e4d15599346e311fa477d1c5762b4" exitCode=0 Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.176181 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" event={"ID":"4cb38560-d007-4226-a858-91afac6c6592","Type":"ContainerDied","Data":"f736d6922af96d6c048e207090b5212bc15e4d15599346e311fa477d1c5762b4"} Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.489812 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.592641 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-config\") pod \"4cb38560-d007-4226-a858-91afac6c6592\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.593460 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-ovsdbserver-sb\") pod \"4cb38560-d007-4226-a858-91afac6c6592\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.593507 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55w2x\" (UniqueName: \"kubernetes.io/projected/4cb38560-d007-4226-a858-91afac6c6592-kube-api-access-55w2x\") pod \"4cb38560-d007-4226-a858-91afac6c6592\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.593706 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-dns-svc\") pod \"4cb38560-d007-4226-a858-91afac6c6592\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.593734 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-ovsdbserver-nb\") pod \"4cb38560-d007-4226-a858-91afac6c6592\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.593782 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-dns-swift-storage-0\") pod \"4cb38560-d007-4226-a858-91afac6c6592\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.593801 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-openstack-edpm-ipam\") pod \"4cb38560-d007-4226-a858-91afac6c6592\" (UID: \"4cb38560-d007-4226-a858-91afac6c6592\") " Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.604107 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cb38560-d007-4226-a858-91afac6c6592-kube-api-access-55w2x" (OuterVolumeSpecName: "kube-api-access-55w2x") pod "4cb38560-d007-4226-a858-91afac6c6592" (UID: "4cb38560-d007-4226-a858-91afac6c6592"). InnerVolumeSpecName "kube-api-access-55w2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.646799 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "4cb38560-d007-4226-a858-91afac6c6592" (UID: "4cb38560-d007-4226-a858-91afac6c6592"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.647714 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4cb38560-d007-4226-a858-91afac6c6592" (UID: "4cb38560-d007-4226-a858-91afac6c6592"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.647735 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4cb38560-d007-4226-a858-91afac6c6592" (UID: "4cb38560-d007-4226-a858-91afac6c6592"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.650523 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4cb38560-d007-4226-a858-91afac6c6592" (UID: "4cb38560-d007-4226-a858-91afac6c6592"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.665048 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4cb38560-d007-4226-a858-91afac6c6592" (UID: "4cb38560-d007-4226-a858-91afac6c6592"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.665919 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-config" (OuterVolumeSpecName: "config") pod "4cb38560-d007-4226-a858-91afac6c6592" (UID: "4cb38560-d007-4226-a858-91afac6c6592"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.696023 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.696056 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.696068 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.696079 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55w2x\" (UniqueName: \"kubernetes.io/projected/4cb38560-d007-4226-a858-91afac6c6592-kube-api-access-55w2x\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:43 crc 
kubenswrapper[4775]: I1216 15:19:43.696092 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.696102 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:43 crc kubenswrapper[4775]: I1216 15:19:43.696112 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cb38560-d007-4226-a858-91afac6c6592-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:44 crc kubenswrapper[4775]: I1216 15:19:44.187824 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" event={"ID":"4cb38560-d007-4226-a858-91afac6c6592","Type":"ContainerDied","Data":"10611fb0eb394ed7cf3a982faa4138b0b3c78c43fd7ca4b85328492338e79185"} Dec 16 15:19:44 crc kubenswrapper[4775]: I1216 15:19:44.187906 4775 scope.go:117] "RemoveContainer" containerID="f736d6922af96d6c048e207090b5212bc15e4d15599346e311fa477d1c5762b4" Dec 16 15:19:44 crc kubenswrapper[4775]: I1216 15:19:44.187926 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-hxvjz" Dec 16 15:19:44 crc kubenswrapper[4775]: I1216 15:19:44.210931 4775 scope.go:117] "RemoveContainer" containerID="680e3d1ee2b7f11f6d0020309ab19b0f9853c289aed0b9b368cfeb2fa53f7793" Dec 16 15:19:44 crc kubenswrapper[4775]: I1216 15:19:44.216067 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-hxvjz"] Dec 16 15:19:44 crc kubenswrapper[4775]: I1216 15:19:44.225140 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-hxvjz"] Dec 16 15:19:45 crc kubenswrapper[4775]: I1216 15:19:45.350032 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cb38560-d007-4226-a858-91afac6c6592" path="/var/lib/kubelet/pods/4cb38560-d007-4226-a858-91afac6c6592/volumes" Dec 16 15:19:49 crc kubenswrapper[4775]: I1216 15:19:49.834817 4775 scope.go:117] "RemoveContainer" containerID="d9863ea985ec9d0dd4a2331755a36d6fec94ef841e316040f21a5123ca2ec3e7" Dec 16 15:19:49 crc kubenswrapper[4775]: I1216 15:19:49.856267 4775 scope.go:117] "RemoveContainer" containerID="c4c31c48bb9cd30ad452a8748b9b30ad2b3cca8757deced584d706b39ded3fc8" Dec 16 15:19:49 crc kubenswrapper[4775]: I1216 15:19:49.882110 4775 scope.go:117] "RemoveContainer" containerID="25ddee36ccfbe183230bbd7501a720ba2e998da9c8bedf15e4cbf26eefa929c9" Dec 16 15:19:49 crc kubenswrapper[4775]: I1216 15:19:49.934055 4775 scope.go:117] "RemoveContainer" containerID="b42bd2c4189538f0ca529cc0da217eaf64d3d2d3399570a407a815d7c9236da2" Dec 16 15:19:54 crc kubenswrapper[4775]: I1216 15:19:54.301267 4775 generic.go:334] "Generic (PLEG): container finished" podID="ba99f865-7192-4da9-8575-62d54a66d82e" containerID="e4c359f73c9a3f9316aed03f86264ee088f62481382f44173e08e4926da63729" exitCode=0 Dec 16 15:19:54 crc kubenswrapper[4775]: I1216 15:19:54.301367 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"ba99f865-7192-4da9-8575-62d54a66d82e","Type":"ContainerDied","Data":"e4c359f73c9a3f9316aed03f86264ee088f62481382f44173e08e4926da63729"} Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.313796 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ba99f865-7192-4da9-8575-62d54a66d82e","Type":"ContainerStarted","Data":"bfe9e9229c60745f72aaaa991fca5e2cf348ae8eac57a075b755073f921e178e"} Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.315716 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.318070 4775 generic.go:334] "Generic (PLEG): container finished" podID="cf004dca-5d2e-4e4d-9c29-66b076fcc406" containerID="d6d722a6e1e586694a11e93f179519ae232185bc1f7fdb8e41f459b595f1c3e6" exitCode=0 Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.318117 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cf004dca-5d2e-4e4d-9c29-66b076fcc406","Type":"ContainerDied","Data":"d6d722a6e1e586694a11e93f179519ae232185bc1f7fdb8e41f459b595f1c3e6"} Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.344554 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.344536947 podStartE2EDuration="36.344536947s" podCreationTimestamp="2025-12-16 15:19:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:19:55.341017955 +0000 UTC m=+1520.292096928" watchObservedRunningTime="2025-12-16 15:19:55.344536947 +0000 UTC m=+1520.295615870" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.567552 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5"] Dec 16 15:19:55 crc kubenswrapper[4775]: E1216 15:19:55.567957 4775 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb38560-d007-4226-a858-91afac6c6592" containerName="init" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.567975 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb38560-d007-4226-a858-91afac6c6592" containerName="init" Dec 16 15:19:55 crc kubenswrapper[4775]: E1216 15:19:55.567990 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b8b1fb4-1c9e-4920-aa5e-767437ef455f" containerName="init" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.567997 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8b1fb4-1c9e-4920-aa5e-767437ef455f" containerName="init" Dec 16 15:19:55 crc kubenswrapper[4775]: E1216 15:19:55.568012 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb38560-d007-4226-a858-91afac6c6592" containerName="dnsmasq-dns" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.568018 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb38560-d007-4226-a858-91afac6c6592" containerName="dnsmasq-dns" Dec 16 15:19:55 crc kubenswrapper[4775]: E1216 15:19:55.568050 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b8b1fb4-1c9e-4920-aa5e-767437ef455f" containerName="dnsmasq-dns" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.568056 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8b1fb4-1c9e-4920-aa5e-767437ef455f" containerName="dnsmasq-dns" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.568289 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cb38560-d007-4226-a858-91afac6c6592" containerName="dnsmasq-dns" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.568312 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b8b1fb4-1c9e-4920-aa5e-767437ef455f" containerName="dnsmasq-dns" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.568933 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.583302 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.584702 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tgv5f" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.585013 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.589180 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.614568 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5"] Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.738642 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q6w5\" (UniqueName: \"kubernetes.io/projected/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-kube-api-access-2q6w5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5\" (UID: \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.739055 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5\" (UID: \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.739088 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5\" (UID: \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.739160 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5\" (UID: \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.840360 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5\" (UID: \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.840435 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5\" (UID: \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.840493 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5\" (UID: \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.840568 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q6w5\" (UniqueName: \"kubernetes.io/projected/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-kube-api-access-2q6w5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5\" (UID: \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.845834 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5\" (UID: \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.846419 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5\" (UID: \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.847574 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5\" (UID: \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.864734 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q6w5\" (UniqueName: \"kubernetes.io/projected/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-kube-api-access-2q6w5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5\" (UID: \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" Dec 16 15:19:55 crc kubenswrapper[4775]: I1216 15:19:55.890446 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" Dec 16 15:19:56 crc kubenswrapper[4775]: I1216 15:19:56.328255 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cf004dca-5d2e-4e4d-9c29-66b076fcc406","Type":"ContainerStarted","Data":"cc4ad6272aefe43809274c526246b3cdbafc72802aa2b82b2e2e024b6de15437"} Dec 16 15:19:56 crc kubenswrapper[4775]: I1216 15:19:56.329055 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:56 crc kubenswrapper[4775]: I1216 15:19:56.362153 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.362130335 podStartE2EDuration="36.362130335s" podCreationTimestamp="2025-12-16 15:19:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:19:56.351798127 +0000 UTC m=+1521.302877070" watchObservedRunningTime="2025-12-16 15:19:56.362130335 +0000 UTC m=+1521.313209258" Dec 16 15:19:56 crc kubenswrapper[4775]: I1216 15:19:56.499656 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5"] Dec 16 15:19:57 crc kubenswrapper[4775]: I1216 15:19:57.348160 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" event={"ID":"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9","Type":"ContainerStarted","Data":"e1ea5f89f3981ae60c784a9b96235844ebe2cde8c39c8cf1d8abe2846ff9e849"} Dec 16 15:20:01 crc kubenswrapper[4775]: I1216 15:20:01.349353 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-69gkk"] Dec 16 15:20:01 crc kubenswrapper[4775]: I1216 15:20:01.351399 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69gkk" Dec 16 15:20:01 crc kubenswrapper[4775]: I1216 15:20:01.366860 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-69gkk"] Dec 16 15:20:01 crc kubenswrapper[4775]: I1216 15:20:01.450112 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-utilities\") pod \"redhat-marketplace-69gkk\" (UID: \"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d\") " pod="openshift-marketplace/redhat-marketplace-69gkk" Dec 16 15:20:01 crc kubenswrapper[4775]: I1216 15:20:01.450159 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-catalog-content\") pod \"redhat-marketplace-69gkk\" (UID: \"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d\") " pod="openshift-marketplace/redhat-marketplace-69gkk" Dec 16 15:20:01 crc kubenswrapper[4775]: I1216 15:20:01.450644 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9lj2\" (UniqueName: \"kubernetes.io/projected/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-kube-api-access-l9lj2\") pod \"redhat-marketplace-69gkk\" (UID: \"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d\") " pod="openshift-marketplace/redhat-marketplace-69gkk" Dec 
16 15:20:01 crc kubenswrapper[4775]: I1216 15:20:01.552527 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-utilities\") pod \"redhat-marketplace-69gkk\" (UID: \"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d\") " pod="openshift-marketplace/redhat-marketplace-69gkk" Dec 16 15:20:01 crc kubenswrapper[4775]: I1216 15:20:01.552580 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-catalog-content\") pod \"redhat-marketplace-69gkk\" (UID: \"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d\") " pod="openshift-marketplace/redhat-marketplace-69gkk" Dec 16 15:20:01 crc kubenswrapper[4775]: I1216 15:20:01.553082 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-utilities\") pod \"redhat-marketplace-69gkk\" (UID: \"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d\") " pod="openshift-marketplace/redhat-marketplace-69gkk" Dec 16 15:20:01 crc kubenswrapper[4775]: I1216 15:20:01.553048 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-catalog-content\") pod \"redhat-marketplace-69gkk\" (UID: \"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d\") " pod="openshift-marketplace/redhat-marketplace-69gkk" Dec 16 15:20:01 crc kubenswrapper[4775]: I1216 15:20:01.553322 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9lj2\" (UniqueName: \"kubernetes.io/projected/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-kube-api-access-l9lj2\") pod \"redhat-marketplace-69gkk\" (UID: \"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d\") " pod="openshift-marketplace/redhat-marketplace-69gkk" Dec 16 15:20:01 crc kubenswrapper[4775]: I1216 
15:20:01.575737 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9lj2\" (UniqueName: \"kubernetes.io/projected/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-kube-api-access-l9lj2\") pod \"redhat-marketplace-69gkk\" (UID: \"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d\") " pod="openshift-marketplace/redhat-marketplace-69gkk" Dec 16 15:20:01 crc kubenswrapper[4775]: I1216 15:20:01.709980 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69gkk" Dec 16 15:20:05 crc kubenswrapper[4775]: W1216 15:20:05.351359 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ed801a1_4fd1_4788_ae7f_7b16bb06b86d.slice/crio-8307a22e778553c6c732fe3912f7a4ec81c13aa74983b0cad9743bb6fd69df3d WatchSource:0}: Error finding container 8307a22e778553c6c732fe3912f7a4ec81c13aa74983b0cad9743bb6fd69df3d: Status 404 returned error can't find the container with id 8307a22e778553c6c732fe3912f7a4ec81c13aa74983b0cad9743bb6fd69df3d Dec 16 15:20:05 crc kubenswrapper[4775]: I1216 15:20:05.354588 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-69gkk"] Dec 16 15:20:05 crc kubenswrapper[4775]: I1216 15:20:05.424087 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" event={"ID":"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9","Type":"ContainerStarted","Data":"2aa36b857ac8a7df3e904e768718ad8b43833015548781a1a7e1e9567a84f099"} Dec 16 15:20:05 crc kubenswrapper[4775]: I1216 15:20:05.425269 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69gkk" event={"ID":"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d","Type":"ContainerStarted","Data":"8307a22e778553c6c732fe3912f7a4ec81c13aa74983b0cad9743bb6fd69df3d"} Dec 16 15:20:05 crc kubenswrapper[4775]: I1216 15:20:05.442389 4775 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" podStartSLOduration=1.970068472 podStartE2EDuration="10.442368736s" podCreationTimestamp="2025-12-16 15:19:55 +0000 UTC" firstStartedPulling="2025-12-16 15:19:56.512427943 +0000 UTC m=+1521.463506866" lastFinishedPulling="2025-12-16 15:20:04.984728207 +0000 UTC m=+1529.935807130" observedRunningTime="2025-12-16 15:20:05.441320343 +0000 UTC m=+1530.392399286" watchObservedRunningTime="2025-12-16 15:20:05.442368736 +0000 UTC m=+1530.393447659" Dec 16 15:20:06 crc kubenswrapper[4775]: I1216 15:20:06.436294 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69gkk" event={"ID":"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d","Type":"ContainerDied","Data":"20764053a8cd3770a2dfff06b1af91dd885d3b30d57c6136cf323eee46b089d1"} Dec 16 15:20:06 crc kubenswrapper[4775]: I1216 15:20:06.436162 4775 generic.go:334] "Generic (PLEG): container finished" podID="4ed801a1-4fd1-4788-ae7f-7b16bb06b86d" containerID="20764053a8cd3770a2dfff06b1af91dd885d3b30d57c6136cf323eee46b089d1" exitCode=0 Dec 16 15:20:08 crc kubenswrapper[4775]: I1216 15:20:08.457658 4775 generic.go:334] "Generic (PLEG): container finished" podID="4ed801a1-4fd1-4788-ae7f-7b16bb06b86d" containerID="599f2c11ebc7e904d5c3c8b1d3c5018234a4f488f478cfacc5c18fa72dc276c7" exitCode=0 Dec 16 15:20:08 crc kubenswrapper[4775]: I1216 15:20:08.457703 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69gkk" event={"ID":"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d","Type":"ContainerDied","Data":"599f2c11ebc7e904d5c3c8b1d3c5018234a4f488f478cfacc5c18fa72dc276c7"} Dec 16 15:20:09 crc kubenswrapper[4775]: I1216 15:20:09.468578 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69gkk" 
event={"ID":"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d","Type":"ContainerStarted","Data":"356c9f2059a005c8132d06f09bd2a7df102f49891633f2564f6bbd2d9a36229d"} Dec 16 15:20:09 crc kubenswrapper[4775]: I1216 15:20:09.489580 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-69gkk" podStartSLOduration=6.026364176 podStartE2EDuration="8.489558291s" podCreationTimestamp="2025-12-16 15:20:01 +0000 UTC" firstStartedPulling="2025-12-16 15:20:06.439125563 +0000 UTC m=+1531.390204476" lastFinishedPulling="2025-12-16 15:20:08.902319668 +0000 UTC m=+1533.853398591" observedRunningTime="2025-12-16 15:20:09.484880132 +0000 UTC m=+1534.435959075" watchObservedRunningTime="2025-12-16 15:20:09.489558291 +0000 UTC m=+1534.440637214" Dec 16 15:20:09 crc kubenswrapper[4775]: I1216 15:20:09.604086 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 16 15:20:10 crc kubenswrapper[4775]: I1216 15:20:10.601131 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:20:11 crc kubenswrapper[4775]: I1216 15:20:11.710187 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-69gkk" Dec 16 15:20:11 crc kubenswrapper[4775]: I1216 15:20:11.711812 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-69gkk" Dec 16 15:20:11 crc kubenswrapper[4775]: I1216 15:20:11.762162 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-69gkk" Dec 16 15:20:17 crc kubenswrapper[4775]: I1216 15:20:17.568921 4775 generic.go:334] "Generic (PLEG): container finished" podID="023c8812-4f2e-4b64-85c7-eabd4ed3d7f9" containerID="2aa36b857ac8a7df3e904e768718ad8b43833015548781a1a7e1e9567a84f099" exitCode=0 Dec 16 15:20:17 crc kubenswrapper[4775]: 
I1216 15:20:17.569024 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" event={"ID":"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9","Type":"ContainerDied","Data":"2aa36b857ac8a7df3e904e768718ad8b43833015548781a1a7e1e9567a84f099"} Dec 16 15:20:18 crc kubenswrapper[4775]: I1216 15:20:18.804586 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zr7qg"] Dec 16 15:20:18 crc kubenswrapper[4775]: I1216 15:20:18.807564 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zr7qg" Dec 16 15:20:18 crc kubenswrapper[4775]: I1216 15:20:18.820638 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zr7qg"] Dec 16 15:20:18 crc kubenswrapper[4775]: I1216 15:20:18.916998 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0e3803-78d9-421b-8268-6fb0c05280c5-catalog-content\") pod \"certified-operators-zr7qg\" (UID: \"5b0e3803-78d9-421b-8268-6fb0c05280c5\") " pod="openshift-marketplace/certified-operators-zr7qg" Dec 16 15:20:18 crc kubenswrapper[4775]: I1216 15:20:18.917047 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w89m5\" (UniqueName: \"kubernetes.io/projected/5b0e3803-78d9-421b-8268-6fb0c05280c5-kube-api-access-w89m5\") pod \"certified-operators-zr7qg\" (UID: \"5b0e3803-78d9-421b-8268-6fb0c05280c5\") " pod="openshift-marketplace/certified-operators-zr7qg" Dec 16 15:20:18 crc kubenswrapper[4775]: I1216 15:20:18.917120 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0e3803-78d9-421b-8268-6fb0c05280c5-utilities\") pod \"certified-operators-zr7qg\" (UID: 
\"5b0e3803-78d9-421b-8268-6fb0c05280c5\") " pod="openshift-marketplace/certified-operators-zr7qg" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.023220 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0e3803-78d9-421b-8268-6fb0c05280c5-catalog-content\") pod \"certified-operators-zr7qg\" (UID: \"5b0e3803-78d9-421b-8268-6fb0c05280c5\") " pod="openshift-marketplace/certified-operators-zr7qg" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.023280 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w89m5\" (UniqueName: \"kubernetes.io/projected/5b0e3803-78d9-421b-8268-6fb0c05280c5-kube-api-access-w89m5\") pod \"certified-operators-zr7qg\" (UID: \"5b0e3803-78d9-421b-8268-6fb0c05280c5\") " pod="openshift-marketplace/certified-operators-zr7qg" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.023383 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0e3803-78d9-421b-8268-6fb0c05280c5-utilities\") pod \"certified-operators-zr7qg\" (UID: \"5b0e3803-78d9-421b-8268-6fb0c05280c5\") " pod="openshift-marketplace/certified-operators-zr7qg" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.024016 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0e3803-78d9-421b-8268-6fb0c05280c5-utilities\") pod \"certified-operators-zr7qg\" (UID: \"5b0e3803-78d9-421b-8268-6fb0c05280c5\") " pod="openshift-marketplace/certified-operators-zr7qg" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.024419 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0e3803-78d9-421b-8268-6fb0c05280c5-catalog-content\") pod \"certified-operators-zr7qg\" (UID: \"5b0e3803-78d9-421b-8268-6fb0c05280c5\") 
" pod="openshift-marketplace/certified-operators-zr7qg" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.060530 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w89m5\" (UniqueName: \"kubernetes.io/projected/5b0e3803-78d9-421b-8268-6fb0c05280c5-kube-api-access-w89m5\") pod \"certified-operators-zr7qg\" (UID: \"5b0e3803-78d9-421b-8268-6fb0c05280c5\") " pod="openshift-marketplace/certified-operators-zr7qg" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.141343 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zr7qg" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.179141 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.329023 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-inventory\") pod \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\" (UID: \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\") " Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.329130 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q6w5\" (UniqueName: \"kubernetes.io/projected/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-kube-api-access-2q6w5\") pod \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\" (UID: \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\") " Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.329319 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-repo-setup-combined-ca-bundle\") pod \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\" (UID: \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\") " Dec 16 15:20:19 crc kubenswrapper[4775]: 
I1216 15:20:19.329400 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-ssh-key\") pod \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\" (UID: \"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9\") " Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.334580 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "023c8812-4f2e-4b64-85c7-eabd4ed3d7f9" (UID: "023c8812-4f2e-4b64-85c7-eabd4ed3d7f9"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.334702 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-kube-api-access-2q6w5" (OuterVolumeSpecName: "kube-api-access-2q6w5") pod "023c8812-4f2e-4b64-85c7-eabd4ed3d7f9" (UID: "023c8812-4f2e-4b64-85c7-eabd4ed3d7f9"). InnerVolumeSpecName "kube-api-access-2q6w5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.370013 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-inventory" (OuterVolumeSpecName: "inventory") pod "023c8812-4f2e-4b64-85c7-eabd4ed3d7f9" (UID: "023c8812-4f2e-4b64-85c7-eabd4ed3d7f9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.400849 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "023c8812-4f2e-4b64-85c7-eabd4ed3d7f9" (UID: "023c8812-4f2e-4b64-85c7-eabd4ed3d7f9"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.432004 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.432039 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q6w5\" (UniqueName: \"kubernetes.io/projected/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-kube-api-access-2q6w5\") on node \"crc\" DevicePath \"\"" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.432053 4775 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.432062 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/023c8812-4f2e-4b64-85c7-eabd4ed3d7f9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.586861 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" event={"ID":"023c8812-4f2e-4b64-85c7-eabd4ed3d7f9","Type":"ContainerDied","Data":"e1ea5f89f3981ae60c784a9b96235844ebe2cde8c39c8cf1d8abe2846ff9e849"} Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.586939 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1ea5f89f3981ae60c784a9b96235844ebe2cde8c39c8cf1d8abe2846ff9e849" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.586968 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.663390 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p"] Dec 16 15:20:19 crc kubenswrapper[4775]: E1216 15:20:19.663799 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="023c8812-4f2e-4b64-85c7-eabd4ed3d7f9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.663818 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="023c8812-4f2e-4b64-85c7-eabd4ed3d7f9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.664050 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="023c8812-4f2e-4b64-85c7-eabd4ed3d7f9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.672612 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.674694 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.674943 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.675110 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tgv5f" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.675304 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.675672 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p"] Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.733834 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zr7qg"] Dec 16 15:20:19 crc kubenswrapper[4775]: W1216 15:20:19.734336 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b0e3803_78d9_421b_8268_6fb0c05280c5.slice/crio-c5c34dd0d73dad93ba17a38d5cb4b1536fcfee5ce667d98d78fc39b4f6e3ebc0 WatchSource:0}: Error finding container c5c34dd0d73dad93ba17a38d5cb4b1536fcfee5ce667d98d78fc39b4f6e3ebc0: Status 404 returned error can't find the container with id c5c34dd0d73dad93ba17a38d5cb4b1536fcfee5ce667d98d78fc39b4f6e3ebc0 Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.742676 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/586695ef-512d-4d00-b127-751849932aef-inventory\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-r5l7p\" (UID: \"586695ef-512d-4d00-b127-751849932aef\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.742727 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/586695ef-512d-4d00-b127-751849932aef-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-r5l7p\" (UID: \"586695ef-512d-4d00-b127-751849932aef\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.742826 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th7cs\" (UniqueName: \"kubernetes.io/projected/586695ef-512d-4d00-b127-751849932aef-kube-api-access-th7cs\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-r5l7p\" (UID: \"586695ef-512d-4d00-b127-751849932aef\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.844211 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/586695ef-512d-4d00-b127-751849932aef-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-r5l7p\" (UID: \"586695ef-512d-4d00-b127-751849932aef\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.844648 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th7cs\" (UniqueName: \"kubernetes.io/projected/586695ef-512d-4d00-b127-751849932aef-kube-api-access-th7cs\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-r5l7p\" (UID: \"586695ef-512d-4d00-b127-751849932aef\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.844750 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/586695ef-512d-4d00-b127-751849932aef-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-r5l7p\" (UID: \"586695ef-512d-4d00-b127-751849932aef\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.851598 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/586695ef-512d-4d00-b127-751849932aef-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-r5l7p\" (UID: \"586695ef-512d-4d00-b127-751849932aef\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.851599 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/586695ef-512d-4d00-b127-751849932aef-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-r5l7p\" (UID: \"586695ef-512d-4d00-b127-751849932aef\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" Dec 16 15:20:19 crc kubenswrapper[4775]: I1216 15:20:19.863855 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th7cs\" (UniqueName: \"kubernetes.io/projected/586695ef-512d-4d00-b127-751849932aef-kube-api-access-th7cs\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-r5l7p\" (UID: \"586695ef-512d-4d00-b127-751849932aef\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" Dec 16 15:20:20 crc kubenswrapper[4775]: I1216 15:20:20.004829 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" Dec 16 15:20:20 crc kubenswrapper[4775]: I1216 15:20:20.599229 4775 generic.go:334] "Generic (PLEG): container finished" podID="5b0e3803-78d9-421b-8268-6fb0c05280c5" containerID="f9cba3196e7d1a95c215ced4ff64feb09097d6a281f7106e2e38a5ed948dd77b" exitCode=0 Dec 16 15:20:20 crc kubenswrapper[4775]: I1216 15:20:20.599339 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zr7qg" event={"ID":"5b0e3803-78d9-421b-8268-6fb0c05280c5","Type":"ContainerDied","Data":"f9cba3196e7d1a95c215ced4ff64feb09097d6a281f7106e2e38a5ed948dd77b"} Dec 16 15:20:20 crc kubenswrapper[4775]: I1216 15:20:20.599625 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zr7qg" event={"ID":"5b0e3803-78d9-421b-8268-6fb0c05280c5","Type":"ContainerStarted","Data":"c5c34dd0d73dad93ba17a38d5cb4b1536fcfee5ce667d98d78fc39b4f6e3ebc0"} Dec 16 15:20:21 crc kubenswrapper[4775]: I1216 15:20:21.225577 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p"] Dec 16 15:20:21 crc kubenswrapper[4775]: I1216 15:20:21.610699 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" event={"ID":"586695ef-512d-4d00-b127-751849932aef","Type":"ContainerStarted","Data":"14b4083e28b837ebbf2d05fa053d44b92cb8ba7bcbc716521d88f34befd7e5dc"} Dec 16 15:20:21 crc kubenswrapper[4775]: I1216 15:20:21.755120 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-69gkk" Dec 16 15:20:22 crc kubenswrapper[4775]: I1216 15:20:22.620667 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" 
event={"ID":"586695ef-512d-4d00-b127-751849932aef","Type":"ContainerStarted","Data":"9794724302f0462640af533511b7a796e47bbe07e22a2cd9a2f2e4aac24afc28"} Dec 16 15:20:22 crc kubenswrapper[4775]: I1216 15:20:22.622993 4775 generic.go:334] "Generic (PLEG): container finished" podID="5b0e3803-78d9-421b-8268-6fb0c05280c5" containerID="c84bff04e84df7be4ab4bda2523e4f67142dfa04d571d02828e5031de67d25fa" exitCode=0 Dec 16 15:20:22 crc kubenswrapper[4775]: I1216 15:20:22.623037 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zr7qg" event={"ID":"5b0e3803-78d9-421b-8268-6fb0c05280c5","Type":"ContainerDied","Data":"c84bff04e84df7be4ab4bda2523e4f67142dfa04d571d02828e5031de67d25fa"} Dec 16 15:20:22 crc kubenswrapper[4775]: I1216 15:20:22.647243 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" podStartSLOduration=2.997603026 podStartE2EDuration="3.647222023s" podCreationTimestamp="2025-12-16 15:20:19 +0000 UTC" firstStartedPulling="2025-12-16 15:20:21.233808495 +0000 UTC m=+1546.184887418" lastFinishedPulling="2025-12-16 15:20:21.883427492 +0000 UTC m=+1546.834506415" observedRunningTime="2025-12-16 15:20:22.643151344 +0000 UTC m=+1547.594230257" watchObservedRunningTime="2025-12-16 15:20:22.647222023 +0000 UTC m=+1547.598300946" Dec 16 15:20:23 crc kubenswrapper[4775]: I1216 15:20:23.519804 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-69gkk"] Dec 16 15:20:23 crc kubenswrapper[4775]: I1216 15:20:23.520050 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-69gkk" podUID="4ed801a1-4fd1-4788-ae7f-7b16bb06b86d" containerName="registry-server" containerID="cri-o://356c9f2059a005c8132d06f09bd2a7df102f49891633f2564f6bbd2d9a36229d" gracePeriod=2 Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.523614 4775 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69gkk" Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.650426 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9lj2\" (UniqueName: \"kubernetes.io/projected/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-kube-api-access-l9lj2\") pod \"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d\" (UID: \"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d\") " Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.650654 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-catalog-content\") pod \"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d\" (UID: \"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d\") " Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.650755 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-utilities\") pod \"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d\" (UID: \"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d\") " Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.653683 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-utilities" (OuterVolumeSpecName: "utilities") pod "4ed801a1-4fd1-4788-ae7f-7b16bb06b86d" (UID: "4ed801a1-4fd1-4788-ae7f-7b16bb06b86d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.657717 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zr7qg" event={"ID":"5b0e3803-78d9-421b-8268-6fb0c05280c5","Type":"ContainerStarted","Data":"fdc4ecc7b73f88e718a511cfb96b954fcc89baf87f14a39d040b4ca25b3f35e6"} Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.660756 4775 generic.go:334] "Generic (PLEG): container finished" podID="4ed801a1-4fd1-4788-ae7f-7b16bb06b86d" containerID="356c9f2059a005c8132d06f09bd2a7df102f49891633f2564f6bbd2d9a36229d" exitCode=0 Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.660816 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69gkk" event={"ID":"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d","Type":"ContainerDied","Data":"356c9f2059a005c8132d06f09bd2a7df102f49891633f2564f6bbd2d9a36229d"} Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.660852 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69gkk" event={"ID":"4ed801a1-4fd1-4788-ae7f-7b16bb06b86d","Type":"ContainerDied","Data":"8307a22e778553c6c732fe3912f7a4ec81c13aa74983b0cad9743bb6fd69df3d"} Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.660879 4775 scope.go:117] "RemoveContainer" containerID="356c9f2059a005c8132d06f09bd2a7df102f49891633f2564f6bbd2d9a36229d" Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.661080 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69gkk" Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.668485 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-kube-api-access-l9lj2" (OuterVolumeSpecName: "kube-api-access-l9lj2") pod "4ed801a1-4fd1-4788-ae7f-7b16bb06b86d" (UID: "4ed801a1-4fd1-4788-ae7f-7b16bb06b86d"). InnerVolumeSpecName "kube-api-access-l9lj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.675825 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ed801a1-4fd1-4788-ae7f-7b16bb06b86d" (UID: "4ed801a1-4fd1-4788-ae7f-7b16bb06b86d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.683559 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zr7qg" podStartSLOduration=3.322700089 podStartE2EDuration="6.683540383s" podCreationTimestamp="2025-12-16 15:20:18 +0000 UTC" firstStartedPulling="2025-12-16 15:20:20.601075232 +0000 UTC m=+1545.552154155" lastFinishedPulling="2025-12-16 15:20:23.961915526 +0000 UTC m=+1548.912994449" observedRunningTime="2025-12-16 15:20:24.679218636 +0000 UTC m=+1549.630297569" watchObservedRunningTime="2025-12-16 15:20:24.683540383 +0000 UTC m=+1549.634619306" Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.724418 4775 scope.go:117] "RemoveContainer" containerID="599f2c11ebc7e904d5c3c8b1d3c5018234a4f488f478cfacc5c18fa72dc276c7" Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.745097 4775 scope.go:117] "RemoveContainer" containerID="20764053a8cd3770a2dfff06b1af91dd885d3b30d57c6136cf323eee46b089d1" Dec 16 15:20:24 crc 
kubenswrapper[4775]: I1216 15:20:24.754435 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9lj2\" (UniqueName: \"kubernetes.io/projected/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-kube-api-access-l9lj2\") on node \"crc\" DevicePath \"\"" Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.754479 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.754490 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.800461 4775 scope.go:117] "RemoveContainer" containerID="356c9f2059a005c8132d06f09bd2a7df102f49891633f2564f6bbd2d9a36229d" Dec 16 15:20:24 crc kubenswrapper[4775]: E1216 15:20:24.801171 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"356c9f2059a005c8132d06f09bd2a7df102f49891633f2564f6bbd2d9a36229d\": container with ID starting with 356c9f2059a005c8132d06f09bd2a7df102f49891633f2564f6bbd2d9a36229d not found: ID does not exist" containerID="356c9f2059a005c8132d06f09bd2a7df102f49891633f2564f6bbd2d9a36229d" Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.801227 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"356c9f2059a005c8132d06f09bd2a7df102f49891633f2564f6bbd2d9a36229d"} err="failed to get container status \"356c9f2059a005c8132d06f09bd2a7df102f49891633f2564f6bbd2d9a36229d\": rpc error: code = NotFound desc = could not find container \"356c9f2059a005c8132d06f09bd2a7df102f49891633f2564f6bbd2d9a36229d\": container with ID starting with 356c9f2059a005c8132d06f09bd2a7df102f49891633f2564f6bbd2d9a36229d 
not found: ID does not exist" Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.801263 4775 scope.go:117] "RemoveContainer" containerID="599f2c11ebc7e904d5c3c8b1d3c5018234a4f488f478cfacc5c18fa72dc276c7" Dec 16 15:20:24 crc kubenswrapper[4775]: E1216 15:20:24.801752 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"599f2c11ebc7e904d5c3c8b1d3c5018234a4f488f478cfacc5c18fa72dc276c7\": container with ID starting with 599f2c11ebc7e904d5c3c8b1d3c5018234a4f488f478cfacc5c18fa72dc276c7 not found: ID does not exist" containerID="599f2c11ebc7e904d5c3c8b1d3c5018234a4f488f478cfacc5c18fa72dc276c7" Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.801781 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599f2c11ebc7e904d5c3c8b1d3c5018234a4f488f478cfacc5c18fa72dc276c7"} err="failed to get container status \"599f2c11ebc7e904d5c3c8b1d3c5018234a4f488f478cfacc5c18fa72dc276c7\": rpc error: code = NotFound desc = could not find container \"599f2c11ebc7e904d5c3c8b1d3c5018234a4f488f478cfacc5c18fa72dc276c7\": container with ID starting with 599f2c11ebc7e904d5c3c8b1d3c5018234a4f488f478cfacc5c18fa72dc276c7 not found: ID does not exist" Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.801795 4775 scope.go:117] "RemoveContainer" containerID="20764053a8cd3770a2dfff06b1af91dd885d3b30d57c6136cf323eee46b089d1" Dec 16 15:20:24 crc kubenswrapper[4775]: E1216 15:20:24.802084 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20764053a8cd3770a2dfff06b1af91dd885d3b30d57c6136cf323eee46b089d1\": container with ID starting with 20764053a8cd3770a2dfff06b1af91dd885d3b30d57c6136cf323eee46b089d1 not found: ID does not exist" containerID="20764053a8cd3770a2dfff06b1af91dd885d3b30d57c6136cf323eee46b089d1" Dec 16 15:20:24 crc kubenswrapper[4775]: I1216 15:20:24.802128 4775 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20764053a8cd3770a2dfff06b1af91dd885d3b30d57c6136cf323eee46b089d1"} err="failed to get container status \"20764053a8cd3770a2dfff06b1af91dd885d3b30d57c6136cf323eee46b089d1\": rpc error: code = NotFound desc = could not find container \"20764053a8cd3770a2dfff06b1af91dd885d3b30d57c6136cf323eee46b089d1\": container with ID starting with 20764053a8cd3770a2dfff06b1af91dd885d3b30d57c6136cf323eee46b089d1 not found: ID does not exist" Dec 16 15:20:25 crc kubenswrapper[4775]: I1216 15:20:25.002381 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-69gkk"] Dec 16 15:20:25 crc kubenswrapper[4775]: I1216 15:20:25.009244 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-69gkk"] Dec 16 15:20:25 crc kubenswrapper[4775]: I1216 15:20:25.347046 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ed801a1-4fd1-4788-ae7f-7b16bb06b86d" path="/var/lib/kubelet/pods/4ed801a1-4fd1-4788-ae7f-7b16bb06b86d/volumes" Dec 16 15:20:25 crc kubenswrapper[4775]: I1216 15:20:25.673034 4775 generic.go:334] "Generic (PLEG): container finished" podID="586695ef-512d-4d00-b127-751849932aef" containerID="9794724302f0462640af533511b7a796e47bbe07e22a2cd9a2f2e4aac24afc28" exitCode=0 Dec 16 15:20:25 crc kubenswrapper[4775]: I1216 15:20:25.673123 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" event={"ID":"586695ef-512d-4d00-b127-751849932aef","Type":"ContainerDied","Data":"9794724302f0462640af533511b7a796e47bbe07e22a2cd9a2f2e4aac24afc28"} Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.150488 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.302278 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/586695ef-512d-4d00-b127-751849932aef-inventory\") pod \"586695ef-512d-4d00-b127-751849932aef\" (UID: \"586695ef-512d-4d00-b127-751849932aef\") " Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.302463 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th7cs\" (UniqueName: \"kubernetes.io/projected/586695ef-512d-4d00-b127-751849932aef-kube-api-access-th7cs\") pod \"586695ef-512d-4d00-b127-751849932aef\" (UID: \"586695ef-512d-4d00-b127-751849932aef\") " Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.302522 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/586695ef-512d-4d00-b127-751849932aef-ssh-key\") pod \"586695ef-512d-4d00-b127-751849932aef\" (UID: \"586695ef-512d-4d00-b127-751849932aef\") " Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.314329 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/586695ef-512d-4d00-b127-751849932aef-kube-api-access-th7cs" (OuterVolumeSpecName: "kube-api-access-th7cs") pod "586695ef-512d-4d00-b127-751849932aef" (UID: "586695ef-512d-4d00-b127-751849932aef"). InnerVolumeSpecName "kube-api-access-th7cs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.332279 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586695ef-512d-4d00-b127-751849932aef-inventory" (OuterVolumeSpecName: "inventory") pod "586695ef-512d-4d00-b127-751849932aef" (UID: "586695ef-512d-4d00-b127-751849932aef"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.339522 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586695ef-512d-4d00-b127-751849932aef-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "586695ef-512d-4d00-b127-751849932aef" (UID: "586695ef-512d-4d00-b127-751849932aef"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.404750 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th7cs\" (UniqueName: \"kubernetes.io/projected/586695ef-512d-4d00-b127-751849932aef-kube-api-access-th7cs\") on node \"crc\" DevicePath \"\"" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.404777 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/586695ef-512d-4d00-b127-751849932aef-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.404787 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/586695ef-512d-4d00-b127-751849932aef-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.693489 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" event={"ID":"586695ef-512d-4d00-b127-751849932aef","Type":"ContainerDied","Data":"14b4083e28b837ebbf2d05fa053d44b92cb8ba7bcbc716521d88f34befd7e5dc"} Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.693542 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14b4083e28b837ebbf2d05fa053d44b92cb8ba7bcbc716521d88f34befd7e5dc" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.693570 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r5l7p" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.782275 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f"] Dec 16 15:20:27 crc kubenswrapper[4775]: E1216 15:20:27.783006 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586695ef-512d-4d00-b127-751849932aef" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.783120 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="586695ef-512d-4d00-b127-751849932aef" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 16 15:20:27 crc kubenswrapper[4775]: E1216 15:20:27.783219 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed801a1-4fd1-4788-ae7f-7b16bb06b86d" containerName="registry-server" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.783295 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed801a1-4fd1-4788-ae7f-7b16bb06b86d" containerName="registry-server" Dec 16 15:20:27 crc kubenswrapper[4775]: E1216 15:20:27.783361 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed801a1-4fd1-4788-ae7f-7b16bb06b86d" containerName="extract-utilities" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.783419 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed801a1-4fd1-4788-ae7f-7b16bb06b86d" containerName="extract-utilities" Dec 16 15:20:27 crc kubenswrapper[4775]: E1216 15:20:27.783527 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed801a1-4fd1-4788-ae7f-7b16bb06b86d" containerName="extract-content" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.783588 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed801a1-4fd1-4788-ae7f-7b16bb06b86d" containerName="extract-content" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.783863 4775 
memory_manager.go:354] "RemoveStaleState removing state" podUID="586695ef-512d-4d00-b127-751849932aef" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.783951 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ed801a1-4fd1-4788-ae7f-7b16bb06b86d" containerName="registry-server" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.784859 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.798334 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.798392 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.798436 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.798345 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tgv5f" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.806307 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f"] Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.917399 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f\" (UID: \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.917531 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq2mf\" (UniqueName: \"kubernetes.io/projected/2b2d1ae7-ec42-4c6c-9400-966f2093d883-kube-api-access-bq2mf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f\" (UID: \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.917680 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f\" (UID: \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" Dec 16 15:20:27 crc kubenswrapper[4775]: I1216 15:20:27.917798 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f\" (UID: \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" Dec 16 15:20:28 crc kubenswrapper[4775]: I1216 15:20:28.019428 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f\" (UID: \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" Dec 16 15:20:28 crc kubenswrapper[4775]: I1216 15:20:28.019551 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-ssh-key\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f\" (UID: \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" Dec 16 15:20:28 crc kubenswrapper[4775]: I1216 15:20:28.019599 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq2mf\" (UniqueName: \"kubernetes.io/projected/2b2d1ae7-ec42-4c6c-9400-966f2093d883-kube-api-access-bq2mf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f\" (UID: \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" Dec 16 15:20:28 crc kubenswrapper[4775]: I1216 15:20:28.019678 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f\" (UID: \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" Dec 16 15:20:28 crc kubenswrapper[4775]: I1216 15:20:28.023969 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f\" (UID: \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" Dec 16 15:20:28 crc kubenswrapper[4775]: I1216 15:20:28.029437 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f\" (UID: \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" Dec 16 15:20:28 crc kubenswrapper[4775]: I1216 
15:20:28.029503 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f\" (UID: \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" Dec 16 15:20:28 crc kubenswrapper[4775]: I1216 15:20:28.036651 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq2mf\" (UniqueName: \"kubernetes.io/projected/2b2d1ae7-ec42-4c6c-9400-966f2093d883-kube-api-access-bq2mf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f\" (UID: \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" Dec 16 15:20:28 crc kubenswrapper[4775]: I1216 15:20:28.114342 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" Dec 16 15:20:28 crc kubenswrapper[4775]: I1216 15:20:28.620294 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f"] Dec 16 15:20:28 crc kubenswrapper[4775]: W1216 15:20:28.634186 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b2d1ae7_ec42_4c6c_9400_966f2093d883.slice/crio-15bfd1fb2bdd4d7c66866608a4fc70839f7e271d750a2919aa4201500b8c31a8 WatchSource:0}: Error finding container 15bfd1fb2bdd4d7c66866608a4fc70839f7e271d750a2919aa4201500b8c31a8: Status 404 returned error can't find the container with id 15bfd1fb2bdd4d7c66866608a4fc70839f7e271d750a2919aa4201500b8c31a8 Dec 16 15:20:28 crc kubenswrapper[4775]: I1216 15:20:28.703464 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" 
event={"ID":"2b2d1ae7-ec42-4c6c-9400-966f2093d883","Type":"ContainerStarted","Data":"15bfd1fb2bdd4d7c66866608a4fc70839f7e271d750a2919aa4201500b8c31a8"} Dec 16 15:20:29 crc kubenswrapper[4775]: I1216 15:20:29.142126 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zr7qg" Dec 16 15:20:29 crc kubenswrapper[4775]: I1216 15:20:29.142176 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zr7qg" Dec 16 15:20:29 crc kubenswrapper[4775]: I1216 15:20:29.192614 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zr7qg" Dec 16 15:20:29 crc kubenswrapper[4775]: I1216 15:20:29.714149 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" event={"ID":"2b2d1ae7-ec42-4c6c-9400-966f2093d883","Type":"ContainerStarted","Data":"a79f29f24f63d17a44c8612e7677cd623f4c59760efb9cce071259f6e0a93f8b"} Dec 16 15:20:29 crc kubenswrapper[4775]: I1216 15:20:29.738079 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" podStartSLOduration=2.252083974 podStartE2EDuration="2.73805931s" podCreationTimestamp="2025-12-16 15:20:27 +0000 UTC" firstStartedPulling="2025-12-16 15:20:28.636101322 +0000 UTC m=+1553.587180245" lastFinishedPulling="2025-12-16 15:20:29.122076658 +0000 UTC m=+1554.073155581" observedRunningTime="2025-12-16 15:20:29.726734391 +0000 UTC m=+1554.677813334" watchObservedRunningTime="2025-12-16 15:20:29.73805931 +0000 UTC m=+1554.689138223" Dec 16 15:20:29 crc kubenswrapper[4775]: I1216 15:20:29.779746 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zr7qg" Dec 16 15:20:29 crc kubenswrapper[4775]: I1216 15:20:29.834397 4775 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/certified-operators-zr7qg"] Dec 16 15:20:31 crc kubenswrapper[4775]: I1216 15:20:31.732441 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zr7qg" podUID="5b0e3803-78d9-421b-8268-6fb0c05280c5" containerName="registry-server" containerID="cri-o://fdc4ecc7b73f88e718a511cfb96b954fcc89baf87f14a39d040b4ca25b3f35e6" gracePeriod=2 Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.183629 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zr7qg" Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.314250 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0e3803-78d9-421b-8268-6fb0c05280c5-utilities\") pod \"5b0e3803-78d9-421b-8268-6fb0c05280c5\" (UID: \"5b0e3803-78d9-421b-8268-6fb0c05280c5\") " Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.314330 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w89m5\" (UniqueName: \"kubernetes.io/projected/5b0e3803-78d9-421b-8268-6fb0c05280c5-kube-api-access-w89m5\") pod \"5b0e3803-78d9-421b-8268-6fb0c05280c5\" (UID: \"5b0e3803-78d9-421b-8268-6fb0c05280c5\") " Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.314572 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0e3803-78d9-421b-8268-6fb0c05280c5-catalog-content\") pod \"5b0e3803-78d9-421b-8268-6fb0c05280c5\" (UID: \"5b0e3803-78d9-421b-8268-6fb0c05280c5\") " Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.315111 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b0e3803-78d9-421b-8268-6fb0c05280c5-utilities" (OuterVolumeSpecName: "utilities") pod 
"5b0e3803-78d9-421b-8268-6fb0c05280c5" (UID: "5b0e3803-78d9-421b-8268-6fb0c05280c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.325574 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b0e3803-78d9-421b-8268-6fb0c05280c5-kube-api-access-w89m5" (OuterVolumeSpecName: "kube-api-access-w89m5") pod "5b0e3803-78d9-421b-8268-6fb0c05280c5" (UID: "5b0e3803-78d9-421b-8268-6fb0c05280c5"). InnerVolumeSpecName "kube-api-access-w89m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.367241 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b0e3803-78d9-421b-8268-6fb0c05280c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b0e3803-78d9-421b-8268-6fb0c05280c5" (UID: "5b0e3803-78d9-421b-8268-6fb0c05280c5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.417097 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0e3803-78d9-421b-8268-6fb0c05280c5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.417463 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0e3803-78d9-421b-8268-6fb0c05280c5-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.417746 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w89m5\" (UniqueName: \"kubernetes.io/projected/5b0e3803-78d9-421b-8268-6fb0c05280c5-kube-api-access-w89m5\") on node \"crc\" DevicePath \"\"" Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.744790 4775 generic.go:334] "Generic (PLEG): container finished" podID="5b0e3803-78d9-421b-8268-6fb0c05280c5" containerID="fdc4ecc7b73f88e718a511cfb96b954fcc89baf87f14a39d040b4ca25b3f35e6" exitCode=0 Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.744841 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zr7qg" event={"ID":"5b0e3803-78d9-421b-8268-6fb0c05280c5","Type":"ContainerDied","Data":"fdc4ecc7b73f88e718a511cfb96b954fcc89baf87f14a39d040b4ca25b3f35e6"} Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.745740 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zr7qg" event={"ID":"5b0e3803-78d9-421b-8268-6fb0c05280c5","Type":"ContainerDied","Data":"c5c34dd0d73dad93ba17a38d5cb4b1536fcfee5ce667d98d78fc39b4f6e3ebc0"} Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.744971 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zr7qg" Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.745768 4775 scope.go:117] "RemoveContainer" containerID="fdc4ecc7b73f88e718a511cfb96b954fcc89baf87f14a39d040b4ca25b3f35e6" Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.781786 4775 scope.go:117] "RemoveContainer" containerID="c84bff04e84df7be4ab4bda2523e4f67142dfa04d571d02828e5031de67d25fa" Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.783408 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zr7qg"] Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.790721 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zr7qg"] Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.807502 4775 scope.go:117] "RemoveContainer" containerID="f9cba3196e7d1a95c215ced4ff64feb09097d6a281f7106e2e38a5ed948dd77b" Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.870608 4775 scope.go:117] "RemoveContainer" containerID="fdc4ecc7b73f88e718a511cfb96b954fcc89baf87f14a39d040b4ca25b3f35e6" Dec 16 15:20:32 crc kubenswrapper[4775]: E1216 15:20:32.871114 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdc4ecc7b73f88e718a511cfb96b954fcc89baf87f14a39d040b4ca25b3f35e6\": container with ID starting with fdc4ecc7b73f88e718a511cfb96b954fcc89baf87f14a39d040b4ca25b3f35e6 not found: ID does not exist" containerID="fdc4ecc7b73f88e718a511cfb96b954fcc89baf87f14a39d040b4ca25b3f35e6" Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.871146 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdc4ecc7b73f88e718a511cfb96b954fcc89baf87f14a39d040b4ca25b3f35e6"} err="failed to get container status \"fdc4ecc7b73f88e718a511cfb96b954fcc89baf87f14a39d040b4ca25b3f35e6\": rpc error: code = NotFound desc = could not find 
container \"fdc4ecc7b73f88e718a511cfb96b954fcc89baf87f14a39d040b4ca25b3f35e6\": container with ID starting with fdc4ecc7b73f88e718a511cfb96b954fcc89baf87f14a39d040b4ca25b3f35e6 not found: ID does not exist" Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.871170 4775 scope.go:117] "RemoveContainer" containerID="c84bff04e84df7be4ab4bda2523e4f67142dfa04d571d02828e5031de67d25fa" Dec 16 15:20:32 crc kubenswrapper[4775]: E1216 15:20:32.871545 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c84bff04e84df7be4ab4bda2523e4f67142dfa04d571d02828e5031de67d25fa\": container with ID starting with c84bff04e84df7be4ab4bda2523e4f67142dfa04d571d02828e5031de67d25fa not found: ID does not exist" containerID="c84bff04e84df7be4ab4bda2523e4f67142dfa04d571d02828e5031de67d25fa" Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.871570 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c84bff04e84df7be4ab4bda2523e4f67142dfa04d571d02828e5031de67d25fa"} err="failed to get container status \"c84bff04e84df7be4ab4bda2523e4f67142dfa04d571d02828e5031de67d25fa\": rpc error: code = NotFound desc = could not find container \"c84bff04e84df7be4ab4bda2523e4f67142dfa04d571d02828e5031de67d25fa\": container with ID starting with c84bff04e84df7be4ab4bda2523e4f67142dfa04d571d02828e5031de67d25fa not found: ID does not exist" Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.871592 4775 scope.go:117] "RemoveContainer" containerID="f9cba3196e7d1a95c215ced4ff64feb09097d6a281f7106e2e38a5ed948dd77b" Dec 16 15:20:32 crc kubenswrapper[4775]: E1216 15:20:32.871907 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9cba3196e7d1a95c215ced4ff64feb09097d6a281f7106e2e38a5ed948dd77b\": container with ID starting with f9cba3196e7d1a95c215ced4ff64feb09097d6a281f7106e2e38a5ed948dd77b not found: ID does 
not exist" containerID="f9cba3196e7d1a95c215ced4ff64feb09097d6a281f7106e2e38a5ed948dd77b" Dec 16 15:20:32 crc kubenswrapper[4775]: I1216 15:20:32.871957 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9cba3196e7d1a95c215ced4ff64feb09097d6a281f7106e2e38a5ed948dd77b"} err="failed to get container status \"f9cba3196e7d1a95c215ced4ff64feb09097d6a281f7106e2e38a5ed948dd77b\": rpc error: code = NotFound desc = could not find container \"f9cba3196e7d1a95c215ced4ff64feb09097d6a281f7106e2e38a5ed948dd77b\": container with ID starting with f9cba3196e7d1a95c215ced4ff64feb09097d6a281f7106e2e38a5ed948dd77b not found: ID does not exist" Dec 16 15:20:33 crc kubenswrapper[4775]: I1216 15:20:33.352670 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b0e3803-78d9-421b-8268-6fb0c05280c5" path="/var/lib/kubelet/pods/5b0e3803-78d9-421b-8268-6fb0c05280c5/volumes" Dec 16 15:20:50 crc kubenswrapper[4775]: I1216 15:20:50.090262 4775 scope.go:117] "RemoveContainer" containerID="d84554a5c40da4428974694dcc9e3f561c5cfade6fd51ad035f0b93b7ca513c5" Dec 16 15:21:05 crc kubenswrapper[4775]: I1216 15:21:05.358459 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-flmvj"] Dec 16 15:21:05 crc kubenswrapper[4775]: E1216 15:21:05.359400 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0e3803-78d9-421b-8268-6fb0c05280c5" containerName="extract-utilities" Dec 16 15:21:05 crc kubenswrapper[4775]: I1216 15:21:05.359415 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0e3803-78d9-421b-8268-6fb0c05280c5" containerName="extract-utilities" Dec 16 15:21:05 crc kubenswrapper[4775]: E1216 15:21:05.359444 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0e3803-78d9-421b-8268-6fb0c05280c5" containerName="extract-content" Dec 16 15:21:05 crc kubenswrapper[4775]: I1216 15:21:05.359451 4775 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="5b0e3803-78d9-421b-8268-6fb0c05280c5" containerName="extract-content" Dec 16 15:21:05 crc kubenswrapper[4775]: E1216 15:21:05.359461 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0e3803-78d9-421b-8268-6fb0c05280c5" containerName="registry-server" Dec 16 15:21:05 crc kubenswrapper[4775]: I1216 15:21:05.359466 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0e3803-78d9-421b-8268-6fb0c05280c5" containerName="registry-server" Dec 16 15:21:05 crc kubenswrapper[4775]: I1216 15:21:05.359652 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b0e3803-78d9-421b-8268-6fb0c05280c5" containerName="registry-server" Dec 16 15:21:05 crc kubenswrapper[4775]: I1216 15:21:05.361146 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-flmvj" Dec 16 15:21:05 crc kubenswrapper[4775]: I1216 15:21:05.389587 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-flmvj"] Dec 16 15:21:05 crc kubenswrapper[4775]: I1216 15:21:05.545945 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a478dc87-5ea8-4989-af80-668548c76d4c-utilities\") pod \"community-operators-flmvj\" (UID: \"a478dc87-5ea8-4989-af80-668548c76d4c\") " pod="openshift-marketplace/community-operators-flmvj" Dec 16 15:21:05 crc kubenswrapper[4775]: I1216 15:21:05.545991 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a478dc87-5ea8-4989-af80-668548c76d4c-catalog-content\") pod \"community-operators-flmvj\" (UID: \"a478dc87-5ea8-4989-af80-668548c76d4c\") " pod="openshift-marketplace/community-operators-flmvj" Dec 16 15:21:05 crc kubenswrapper[4775]: I1216 15:21:05.546028 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4pzq\" (UniqueName: \"kubernetes.io/projected/a478dc87-5ea8-4989-af80-668548c76d4c-kube-api-access-x4pzq\") pod \"community-operators-flmvj\" (UID: \"a478dc87-5ea8-4989-af80-668548c76d4c\") " pod="openshift-marketplace/community-operators-flmvj" Dec 16 15:21:05 crc kubenswrapper[4775]: I1216 15:21:05.647764 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a478dc87-5ea8-4989-af80-668548c76d4c-utilities\") pod \"community-operators-flmvj\" (UID: \"a478dc87-5ea8-4989-af80-668548c76d4c\") " pod="openshift-marketplace/community-operators-flmvj" Dec 16 15:21:05 crc kubenswrapper[4775]: I1216 15:21:05.647831 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a478dc87-5ea8-4989-af80-668548c76d4c-catalog-content\") pod \"community-operators-flmvj\" (UID: \"a478dc87-5ea8-4989-af80-668548c76d4c\") " pod="openshift-marketplace/community-operators-flmvj" Dec 16 15:21:05 crc kubenswrapper[4775]: I1216 15:21:05.647877 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4pzq\" (UniqueName: \"kubernetes.io/projected/a478dc87-5ea8-4989-af80-668548c76d4c-kube-api-access-x4pzq\") pod \"community-operators-flmvj\" (UID: \"a478dc87-5ea8-4989-af80-668548c76d4c\") " pod="openshift-marketplace/community-operators-flmvj" Dec 16 15:21:05 crc kubenswrapper[4775]: I1216 15:21:05.648484 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a478dc87-5ea8-4989-af80-668548c76d4c-utilities\") pod \"community-operators-flmvj\" (UID: \"a478dc87-5ea8-4989-af80-668548c76d4c\") " pod="openshift-marketplace/community-operators-flmvj" Dec 16 15:21:05 crc kubenswrapper[4775]: I1216 15:21:05.648551 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a478dc87-5ea8-4989-af80-668548c76d4c-catalog-content\") pod \"community-operators-flmvj\" (UID: \"a478dc87-5ea8-4989-af80-668548c76d4c\") " pod="openshift-marketplace/community-operators-flmvj" Dec 16 15:21:05 crc kubenswrapper[4775]: I1216 15:21:05.686006 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4pzq\" (UniqueName: \"kubernetes.io/projected/a478dc87-5ea8-4989-af80-668548c76d4c-kube-api-access-x4pzq\") pod \"community-operators-flmvj\" (UID: \"a478dc87-5ea8-4989-af80-668548c76d4c\") " pod="openshift-marketplace/community-operators-flmvj" Dec 16 15:21:05 crc kubenswrapper[4775]: I1216 15:21:05.696170 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-flmvj" Dec 16 15:21:06 crc kubenswrapper[4775]: I1216 15:21:06.190632 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-flmvj"] Dec 16 15:21:07 crc kubenswrapper[4775]: I1216 15:21:07.068325 4775 generic.go:334] "Generic (PLEG): container finished" podID="a478dc87-5ea8-4989-af80-668548c76d4c" containerID="db6cc4ce947be66de09f678dfe401819773228ab23e21760e5df9192f21775a6" exitCode=0 Dec 16 15:21:07 crc kubenswrapper[4775]: I1216 15:21:07.068372 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flmvj" event={"ID":"a478dc87-5ea8-4989-af80-668548c76d4c","Type":"ContainerDied","Data":"db6cc4ce947be66de09f678dfe401819773228ab23e21760e5df9192f21775a6"} Dec 16 15:21:07 crc kubenswrapper[4775]: I1216 15:21:07.068688 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flmvj" event={"ID":"a478dc87-5ea8-4989-af80-668548c76d4c","Type":"ContainerStarted","Data":"f7a0ef7795f4b376ff13863b36fc829d0e737f494f7ef07ab2b4c8527d1c5707"} Dec 16 15:21:09 crc 
kubenswrapper[4775]: I1216 15:21:09.087667 4775 generic.go:334] "Generic (PLEG): container finished" podID="a478dc87-5ea8-4989-af80-668548c76d4c" containerID="86ec5c98b6e249f8935084d1423bc262013ad557d8f0be1c9f12e31c02b60fbe" exitCode=0 Dec 16 15:21:09 crc kubenswrapper[4775]: I1216 15:21:09.087739 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flmvj" event={"ID":"a478dc87-5ea8-4989-af80-668548c76d4c","Type":"ContainerDied","Data":"86ec5c98b6e249f8935084d1423bc262013ad557d8f0be1c9f12e31c02b60fbe"} Dec 16 15:21:10 crc kubenswrapper[4775]: I1216 15:21:10.101027 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flmvj" event={"ID":"a478dc87-5ea8-4989-af80-668548c76d4c","Type":"ContainerStarted","Data":"291b6007d5254ea6abdf89e5b4b3b1f10991481c22e777f9927eb624a7ef58f7"} Dec 16 15:21:10 crc kubenswrapper[4775]: I1216 15:21:10.122735 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-flmvj" podStartSLOduration=2.3945341989999998 podStartE2EDuration="5.122718584s" podCreationTimestamp="2025-12-16 15:21:05 +0000 UTC" firstStartedPulling="2025-12-16 15:21:07.070086436 +0000 UTC m=+1592.021165359" lastFinishedPulling="2025-12-16 15:21:09.798270821 +0000 UTC m=+1594.749349744" observedRunningTime="2025-12-16 15:21:10.116362292 +0000 UTC m=+1595.067441215" watchObservedRunningTime="2025-12-16 15:21:10.122718584 +0000 UTC m=+1595.073797507" Dec 16 15:21:15 crc kubenswrapper[4775]: I1216 15:21:15.696685 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-flmvj" Dec 16 15:21:15 crc kubenswrapper[4775]: I1216 15:21:15.697428 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-flmvj" Dec 16 15:21:15 crc kubenswrapper[4775]: I1216 15:21:15.748661 4775 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-flmvj" Dec 16 15:21:16 crc kubenswrapper[4775]: I1216 15:21:16.233245 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-flmvj" Dec 16 15:21:16 crc kubenswrapper[4775]: I1216 15:21:16.306756 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-flmvj"] Dec 16 15:21:18 crc kubenswrapper[4775]: I1216 15:21:18.207418 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-flmvj" podUID="a478dc87-5ea8-4989-af80-668548c76d4c" containerName="registry-server" containerID="cri-o://291b6007d5254ea6abdf89e5b4b3b1f10991481c22e777f9927eb624a7ef58f7" gracePeriod=2 Dec 16 15:21:18 crc kubenswrapper[4775]: I1216 15:21:18.704343 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-flmvj" Dec 16 15:21:18 crc kubenswrapper[4775]: I1216 15:21:18.804221 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a478dc87-5ea8-4989-af80-668548c76d4c-catalog-content\") pod \"a478dc87-5ea8-4989-af80-668548c76d4c\" (UID: \"a478dc87-5ea8-4989-af80-668548c76d4c\") " Dec 16 15:21:18 crc kubenswrapper[4775]: I1216 15:21:18.804278 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4pzq\" (UniqueName: \"kubernetes.io/projected/a478dc87-5ea8-4989-af80-668548c76d4c-kube-api-access-x4pzq\") pod \"a478dc87-5ea8-4989-af80-668548c76d4c\" (UID: \"a478dc87-5ea8-4989-af80-668548c76d4c\") " Dec 16 15:21:18 crc kubenswrapper[4775]: I1216 15:21:18.804354 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a478dc87-5ea8-4989-af80-668548c76d4c-utilities\") pod \"a478dc87-5ea8-4989-af80-668548c76d4c\" (UID: \"a478dc87-5ea8-4989-af80-668548c76d4c\") " Dec 16 15:21:18 crc kubenswrapper[4775]: I1216 15:21:18.805649 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a478dc87-5ea8-4989-af80-668548c76d4c-utilities" (OuterVolumeSpecName: "utilities") pod "a478dc87-5ea8-4989-af80-668548c76d4c" (UID: "a478dc87-5ea8-4989-af80-668548c76d4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:21:18 crc kubenswrapper[4775]: I1216 15:21:18.810492 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a478dc87-5ea8-4989-af80-668548c76d4c-kube-api-access-x4pzq" (OuterVolumeSpecName: "kube-api-access-x4pzq") pod "a478dc87-5ea8-4989-af80-668548c76d4c" (UID: "a478dc87-5ea8-4989-af80-668548c76d4c"). InnerVolumeSpecName "kube-api-access-x4pzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:21:18 crc kubenswrapper[4775]: I1216 15:21:18.858618 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a478dc87-5ea8-4989-af80-668548c76d4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a478dc87-5ea8-4989-af80-668548c76d4c" (UID: "a478dc87-5ea8-4989-af80-668548c76d4c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:21:18 crc kubenswrapper[4775]: I1216 15:21:18.907048 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a478dc87-5ea8-4989-af80-668548c76d4c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:21:18 crc kubenswrapper[4775]: I1216 15:21:18.907076 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4pzq\" (UniqueName: \"kubernetes.io/projected/a478dc87-5ea8-4989-af80-668548c76d4c-kube-api-access-x4pzq\") on node \"crc\" DevicePath \"\"" Dec 16 15:21:18 crc kubenswrapper[4775]: I1216 15:21:18.907086 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a478dc87-5ea8-4989-af80-668548c76d4c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:21:19 crc kubenswrapper[4775]: I1216 15:21:19.218326 4775 generic.go:334] "Generic (PLEG): container finished" podID="a478dc87-5ea8-4989-af80-668548c76d4c" containerID="291b6007d5254ea6abdf89e5b4b3b1f10991481c22e777f9927eb624a7ef58f7" exitCode=0 Dec 16 15:21:19 crc kubenswrapper[4775]: I1216 15:21:19.218387 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flmvj" event={"ID":"a478dc87-5ea8-4989-af80-668548c76d4c","Type":"ContainerDied","Data":"291b6007d5254ea6abdf89e5b4b3b1f10991481c22e777f9927eb624a7ef58f7"} Dec 16 15:21:19 crc kubenswrapper[4775]: I1216 15:21:19.218400 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-flmvj" Dec 16 15:21:19 crc kubenswrapper[4775]: I1216 15:21:19.218428 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flmvj" event={"ID":"a478dc87-5ea8-4989-af80-668548c76d4c","Type":"ContainerDied","Data":"f7a0ef7795f4b376ff13863b36fc829d0e737f494f7ef07ab2b4c8527d1c5707"} Dec 16 15:21:19 crc kubenswrapper[4775]: I1216 15:21:19.218451 4775 scope.go:117] "RemoveContainer" containerID="291b6007d5254ea6abdf89e5b4b3b1f10991481c22e777f9927eb624a7ef58f7" Dec 16 15:21:19 crc kubenswrapper[4775]: I1216 15:21:19.239904 4775 scope.go:117] "RemoveContainer" containerID="86ec5c98b6e249f8935084d1423bc262013ad557d8f0be1c9f12e31c02b60fbe" Dec 16 15:21:19 crc kubenswrapper[4775]: I1216 15:21:19.252993 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-flmvj"] Dec 16 15:21:19 crc kubenswrapper[4775]: I1216 15:21:19.260661 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-flmvj"] Dec 16 15:21:19 crc kubenswrapper[4775]: I1216 15:21:19.281533 4775 scope.go:117] "RemoveContainer" containerID="db6cc4ce947be66de09f678dfe401819773228ab23e21760e5df9192f21775a6" Dec 16 15:21:19 crc kubenswrapper[4775]: I1216 15:21:19.312056 4775 scope.go:117] "RemoveContainer" containerID="291b6007d5254ea6abdf89e5b4b3b1f10991481c22e777f9927eb624a7ef58f7" Dec 16 15:21:19 crc kubenswrapper[4775]: E1216 15:21:19.312505 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"291b6007d5254ea6abdf89e5b4b3b1f10991481c22e777f9927eb624a7ef58f7\": container with ID starting with 291b6007d5254ea6abdf89e5b4b3b1f10991481c22e777f9927eb624a7ef58f7 not found: ID does not exist" containerID="291b6007d5254ea6abdf89e5b4b3b1f10991481c22e777f9927eb624a7ef58f7" Dec 16 15:21:19 crc kubenswrapper[4775]: I1216 15:21:19.312546 4775 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"291b6007d5254ea6abdf89e5b4b3b1f10991481c22e777f9927eb624a7ef58f7"} err="failed to get container status \"291b6007d5254ea6abdf89e5b4b3b1f10991481c22e777f9927eb624a7ef58f7\": rpc error: code = NotFound desc = could not find container \"291b6007d5254ea6abdf89e5b4b3b1f10991481c22e777f9927eb624a7ef58f7\": container with ID starting with 291b6007d5254ea6abdf89e5b4b3b1f10991481c22e777f9927eb624a7ef58f7 not found: ID does not exist" Dec 16 15:21:19 crc kubenswrapper[4775]: I1216 15:21:19.312574 4775 scope.go:117] "RemoveContainer" containerID="86ec5c98b6e249f8935084d1423bc262013ad557d8f0be1c9f12e31c02b60fbe" Dec 16 15:21:19 crc kubenswrapper[4775]: E1216 15:21:19.313040 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86ec5c98b6e249f8935084d1423bc262013ad557d8f0be1c9f12e31c02b60fbe\": container with ID starting with 86ec5c98b6e249f8935084d1423bc262013ad557d8f0be1c9f12e31c02b60fbe not found: ID does not exist" containerID="86ec5c98b6e249f8935084d1423bc262013ad557d8f0be1c9f12e31c02b60fbe" Dec 16 15:21:19 crc kubenswrapper[4775]: I1216 15:21:19.313345 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86ec5c98b6e249f8935084d1423bc262013ad557d8f0be1c9f12e31c02b60fbe"} err="failed to get container status \"86ec5c98b6e249f8935084d1423bc262013ad557d8f0be1c9f12e31c02b60fbe\": rpc error: code = NotFound desc = could not find container \"86ec5c98b6e249f8935084d1423bc262013ad557d8f0be1c9f12e31c02b60fbe\": container with ID starting with 86ec5c98b6e249f8935084d1423bc262013ad557d8f0be1c9f12e31c02b60fbe not found: ID does not exist" Dec 16 15:21:19 crc kubenswrapper[4775]: I1216 15:21:19.313400 4775 scope.go:117] "RemoveContainer" containerID="db6cc4ce947be66de09f678dfe401819773228ab23e21760e5df9192f21775a6" Dec 16 15:21:19 crc kubenswrapper[4775]: E1216 
15:21:19.313708 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db6cc4ce947be66de09f678dfe401819773228ab23e21760e5df9192f21775a6\": container with ID starting with db6cc4ce947be66de09f678dfe401819773228ab23e21760e5df9192f21775a6 not found: ID does not exist" containerID="db6cc4ce947be66de09f678dfe401819773228ab23e21760e5df9192f21775a6" Dec 16 15:21:19 crc kubenswrapper[4775]: I1216 15:21:19.313742 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db6cc4ce947be66de09f678dfe401819773228ab23e21760e5df9192f21775a6"} err="failed to get container status \"db6cc4ce947be66de09f678dfe401819773228ab23e21760e5df9192f21775a6\": rpc error: code = NotFound desc = could not find container \"db6cc4ce947be66de09f678dfe401819773228ab23e21760e5df9192f21775a6\": container with ID starting with db6cc4ce947be66de09f678dfe401819773228ab23e21760e5df9192f21775a6 not found: ID does not exist" Dec 16 15:21:19 crc kubenswrapper[4775]: I1216 15:21:19.347993 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a478dc87-5ea8-4989-af80-668548c76d4c" path="/var/lib/kubelet/pods/a478dc87-5ea8-4989-af80-668548c76d4c/volumes" Dec 16 15:21:32 crc kubenswrapper[4775]: I1216 15:21:32.869435 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:21:32 crc kubenswrapper[4775]: I1216 15:21:32.870164 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 16 15:21:50 crc kubenswrapper[4775]: I1216 15:21:50.180457 4775 scope.go:117] "RemoveContainer" containerID="fe794b90b9faf3f59c87057ca9919d556ad0cedd8e3a756fca5550c4d7873bd2" Dec 16 15:21:50 crc kubenswrapper[4775]: I1216 15:21:50.213940 4775 scope.go:117] "RemoveContainer" containerID="d5e60c2d53aec878da10d04765a76a1e18632d81605d744e1e12e0afc666286f" Dec 16 15:21:50 crc kubenswrapper[4775]: I1216 15:21:50.255869 4775 scope.go:117] "RemoveContainer" containerID="dee480379df7d5b47227646e7ab12f025e7864848865937100bcbbf6bb34b08b" Dec 16 15:22:02 crc kubenswrapper[4775]: I1216 15:22:02.868730 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:22:02 crc kubenswrapper[4775]: I1216 15:22:02.869366 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:22:32 crc kubenswrapper[4775]: I1216 15:22:32.869363 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:22:32 crc kubenswrapper[4775]: I1216 15:22:32.869808 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 16 15:22:32 crc kubenswrapper[4775]: I1216 15:22:32.869849 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 15:22:32 crc kubenswrapper[4775]: I1216 15:22:32.870527 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda"} pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:22:32 crc kubenswrapper[4775]: I1216 15:22:32.870581 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" containerID="cri-o://0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" gracePeriod=600 Dec 16 15:22:33 crc kubenswrapper[4775]: E1216 15:22:33.547764 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:22:33 crc kubenswrapper[4775]: I1216 15:22:33.938163 4775 generic.go:334] "Generic (PLEG): container finished" podID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" exitCode=0 Dec 16 15:22:33 crc kubenswrapper[4775]: I1216 15:22:33.938186 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerDied","Data":"0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda"} Dec 16 15:22:33 crc kubenswrapper[4775]: I1216 15:22:33.938552 4775 scope.go:117] "RemoveContainer" containerID="191baaa15580fc980936d4ebfb4d77ed829d99816880e694e3e02bd3ec00e6a9" Dec 16 15:22:33 crc kubenswrapper[4775]: I1216 15:22:33.939274 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:22:33 crc kubenswrapper[4775]: E1216 15:22:33.939545 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:22:47 crc kubenswrapper[4775]: I1216 15:22:47.338357 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:22:47 crc kubenswrapper[4775]: E1216 15:22:47.339237 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:22:50 crc kubenswrapper[4775]: I1216 15:22:50.360297 4775 scope.go:117] "RemoveContainer" containerID="8906e4b9e85b5b63b8682c82f8bb1a39eecc76a007eefd56a8310c2c1fc594a8" Dec 16 15:22:50 crc kubenswrapper[4775]: I1216 15:22:50.384794 4775 
scope.go:117] "RemoveContainer" containerID="7b8212cb594a5b8536189bbaf23f255279bcdcd2f1ef535fd2dc115dc6a2dccf" Dec 16 15:22:58 crc kubenswrapper[4775]: I1216 15:22:58.337690 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:22:58 crc kubenswrapper[4775]: E1216 15:22:58.338466 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:23:12 crc kubenswrapper[4775]: I1216 15:23:12.344664 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:23:12 crc kubenswrapper[4775]: E1216 15:23:12.345528 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:23:26 crc kubenswrapper[4775]: I1216 15:23:26.338402 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:23:26 crc kubenswrapper[4775]: E1216 15:23:26.339165 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:23:37 crc kubenswrapper[4775]: I1216 15:23:37.338461 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:23:37 crc kubenswrapper[4775]: E1216 15:23:37.339223 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:23:50 crc kubenswrapper[4775]: I1216 15:23:50.466274 4775 scope.go:117] "RemoveContainer" containerID="b7b68e3a0262a8a8082c4920ba9713c9becf917dec8a90cb506ae627478b6193" Dec 16 15:23:50 crc kubenswrapper[4775]: I1216 15:23:50.495982 4775 scope.go:117] "RemoveContainer" containerID="9bcaaec2564cb5ffff7898286e099b3dd73deb6bc52cd3d3dd77958648eb0768" Dec 16 15:23:50 crc kubenswrapper[4775]: I1216 15:23:50.518211 4775 scope.go:117] "RemoveContainer" containerID="42f8e8607b50b5848bd7c3b9711fb75e4ff0c4b9673e45672b0426d5675c457e" Dec 16 15:23:52 crc kubenswrapper[4775]: I1216 15:23:52.337809 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:23:52 crc kubenswrapper[4775]: E1216 15:23:52.338422 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:23:56 crc kubenswrapper[4775]: I1216 15:23:56.101588 4775 generic.go:334] "Generic (PLEG): container finished" podID="2b2d1ae7-ec42-4c6c-9400-966f2093d883" containerID="a79f29f24f63d17a44c8612e7677cd623f4c59760efb9cce071259f6e0a93f8b" exitCode=0 Dec 16 15:23:56 crc kubenswrapper[4775]: I1216 15:23:56.101697 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" event={"ID":"2b2d1ae7-ec42-4c6c-9400-966f2093d883","Type":"ContainerDied","Data":"a79f29f24f63d17a44c8612e7677cd623f4c59760efb9cce071259f6e0a93f8b"} Dec 16 15:23:57 crc kubenswrapper[4775]: I1216 15:23:57.515733 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" Dec 16 15:23:57 crc kubenswrapper[4775]: I1216 15:23:57.639840 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-bootstrap-combined-ca-bundle\") pod \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\" (UID: \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\") " Dec 16 15:23:57 crc kubenswrapper[4775]: I1216 15:23:57.640104 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-inventory\") pod \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\" (UID: \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\") " Dec 16 15:23:57 crc kubenswrapper[4775]: I1216 15:23:57.640247 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-ssh-key\") pod \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\" (UID: \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\") " Dec 16 15:23:57 crc 
kubenswrapper[4775]: I1216 15:23:57.640369 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq2mf\" (UniqueName: \"kubernetes.io/projected/2b2d1ae7-ec42-4c6c-9400-966f2093d883-kube-api-access-bq2mf\") pod \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\" (UID: \"2b2d1ae7-ec42-4c6c-9400-966f2093d883\") " Dec 16 15:23:57 crc kubenswrapper[4775]: I1216 15:23:57.645724 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2b2d1ae7-ec42-4c6c-9400-966f2093d883" (UID: "2b2d1ae7-ec42-4c6c-9400-966f2093d883"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:23:57 crc kubenswrapper[4775]: I1216 15:23:57.645775 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b2d1ae7-ec42-4c6c-9400-966f2093d883-kube-api-access-bq2mf" (OuterVolumeSpecName: "kube-api-access-bq2mf") pod "2b2d1ae7-ec42-4c6c-9400-966f2093d883" (UID: "2b2d1ae7-ec42-4c6c-9400-966f2093d883"). InnerVolumeSpecName "kube-api-access-bq2mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:23:57 crc kubenswrapper[4775]: I1216 15:23:57.668532 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2b2d1ae7-ec42-4c6c-9400-966f2093d883" (UID: "2b2d1ae7-ec42-4c6c-9400-966f2093d883"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:23:57 crc kubenswrapper[4775]: I1216 15:23:57.671261 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-inventory" (OuterVolumeSpecName: "inventory") pod "2b2d1ae7-ec42-4c6c-9400-966f2093d883" (UID: "2b2d1ae7-ec42-4c6c-9400-966f2093d883"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:23:57 crc kubenswrapper[4775]: I1216 15:23:57.742665 4775 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:23:57 crc kubenswrapper[4775]: I1216 15:23:57.742700 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:23:57 crc kubenswrapper[4775]: I1216 15:23:57.742710 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b2d1ae7-ec42-4c6c-9400-966f2093d883-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:23:57 crc kubenswrapper[4775]: I1216 15:23:57.742719 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq2mf\" (UniqueName: \"kubernetes.io/projected/2b2d1ae7-ec42-4c6c-9400-966f2093d883-kube-api-access-bq2mf\") on node \"crc\" DevicePath \"\"" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.121613 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" event={"ID":"2b2d1ae7-ec42-4c6c-9400-966f2093d883","Type":"ContainerDied","Data":"15bfd1fb2bdd4d7c66866608a4fc70839f7e271d750a2919aa4201500b8c31a8"} Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.121662 4775 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="15bfd1fb2bdd4d7c66866608a4fc70839f7e271d750a2919aa4201500b8c31a8" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.121714 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.228714 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp"] Dec 16 15:23:58 crc kubenswrapper[4775]: E1216 15:23:58.229374 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a478dc87-5ea8-4989-af80-668548c76d4c" containerName="registry-server" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.229402 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a478dc87-5ea8-4989-af80-668548c76d4c" containerName="registry-server" Dec 16 15:23:58 crc kubenswrapper[4775]: E1216 15:23:58.229435 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a478dc87-5ea8-4989-af80-668548c76d4c" containerName="extract-content" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.229444 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a478dc87-5ea8-4989-af80-668548c76d4c" containerName="extract-content" Dec 16 15:23:58 crc kubenswrapper[4775]: E1216 15:23:58.229474 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a478dc87-5ea8-4989-af80-668548c76d4c" containerName="extract-utilities" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.229484 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a478dc87-5ea8-4989-af80-668548c76d4c" containerName="extract-utilities" Dec 16 15:23:58 crc kubenswrapper[4775]: E1216 15:23:58.229507 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b2d1ae7-ec42-4c6c-9400-966f2093d883" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 
15:23:58.229520 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b2d1ae7-ec42-4c6c-9400-966f2093d883" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.229775 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a478dc87-5ea8-4989-af80-668548c76d4c" containerName="registry-server" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.229819 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b2d1ae7-ec42-4c6c-9400-966f2093d883" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.230830 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.233594 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.233767 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tgv5f" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.233922 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.234067 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.241597 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp"] Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.353183 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fkrj\" (UniqueName: 
\"kubernetes.io/projected/32966f09-4e16-4fcb-925e-edb1c957cea1-kube-api-access-5fkrj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp\" (UID: \"32966f09-4e16-4fcb-925e-edb1c957cea1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.353248 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32966f09-4e16-4fcb-925e-edb1c957cea1-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp\" (UID: \"32966f09-4e16-4fcb-925e-edb1c957cea1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.353447 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32966f09-4e16-4fcb-925e-edb1c957cea1-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp\" (UID: \"32966f09-4e16-4fcb-925e-edb1c957cea1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.455309 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32966f09-4e16-4fcb-925e-edb1c957cea1-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp\" (UID: \"32966f09-4e16-4fcb-925e-edb1c957cea1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.455461 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fkrj\" (UniqueName: \"kubernetes.io/projected/32966f09-4e16-4fcb-925e-edb1c957cea1-kube-api-access-5fkrj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp\" (UID: \"32966f09-4e16-4fcb-925e-edb1c957cea1\") " 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.455568 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32966f09-4e16-4fcb-925e-edb1c957cea1-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp\" (UID: \"32966f09-4e16-4fcb-925e-edb1c957cea1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.460941 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32966f09-4e16-4fcb-925e-edb1c957cea1-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp\" (UID: \"32966f09-4e16-4fcb-925e-edb1c957cea1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.461022 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32966f09-4e16-4fcb-925e-edb1c957cea1-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp\" (UID: \"32966f09-4e16-4fcb-925e-edb1c957cea1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.479235 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fkrj\" (UniqueName: \"kubernetes.io/projected/32966f09-4e16-4fcb-925e-edb1c957cea1-kube-api-access-5fkrj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp\" (UID: \"32966f09-4e16-4fcb-925e-edb1c957cea1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" Dec 16 15:23:58 crc kubenswrapper[4775]: I1216 15:23:58.554207 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" Dec 16 15:23:59 crc kubenswrapper[4775]: I1216 15:23:59.102549 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp"] Dec 16 15:23:59 crc kubenswrapper[4775]: I1216 15:23:59.106513 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 15:23:59 crc kubenswrapper[4775]: I1216 15:23:59.130925 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" event={"ID":"32966f09-4e16-4fcb-925e-edb1c957cea1","Type":"ContainerStarted","Data":"3c73ffb426eb1b60b99ede8ce14a72ad9d585a008c21db8c9c5bda1877643f89"} Dec 16 15:24:00 crc kubenswrapper[4775]: I1216 15:24:00.141896 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" event={"ID":"32966f09-4e16-4fcb-925e-edb1c957cea1","Type":"ContainerStarted","Data":"b25349b770122ce06b00c8a9be3324c781a94c1c9bafc65093af56ffb03c99fe"} Dec 16 15:24:00 crc kubenswrapper[4775]: I1216 15:24:00.164906 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" podStartSLOduration=1.582569793 podStartE2EDuration="2.164871685s" podCreationTimestamp="2025-12-16 15:23:58 +0000 UTC" firstStartedPulling="2025-12-16 15:23:59.106316488 +0000 UTC m=+1764.057395411" lastFinishedPulling="2025-12-16 15:23:59.68861838 +0000 UTC m=+1764.639697303" observedRunningTime="2025-12-16 15:24:00.161000372 +0000 UTC m=+1765.112079315" watchObservedRunningTime="2025-12-16 15:24:00.164871685 +0000 UTC m=+1765.115950608" Dec 16 15:24:03 crc kubenswrapper[4775]: I1216 15:24:03.339522 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:24:03 crc 
kubenswrapper[4775]: E1216 15:24:03.340409 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:24:15 crc kubenswrapper[4775]: I1216 15:24:15.345341 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:24:15 crc kubenswrapper[4775]: E1216 15:24:15.346437 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:24:29 crc kubenswrapper[4775]: I1216 15:24:29.045590 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-4fce-account-create-update-8h6sf"] Dec 16 15:24:29 crc kubenswrapper[4775]: I1216 15:24:29.055115 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-4fce-account-create-update-8h6sf"] Dec 16 15:24:29 crc kubenswrapper[4775]: I1216 15:24:29.355686 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f018ae8-2c5c-41f5-a7e5-be48d695db30" path="/var/lib/kubelet/pods/9f018ae8-2c5c-41f5-a7e5-be48d695db30/volumes" Dec 16 15:24:30 crc kubenswrapper[4775]: I1216 15:24:30.042411 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6a2c-account-create-update-l9k8g"] Dec 16 15:24:30 crc kubenswrapper[4775]: I1216 15:24:30.053651 4775 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mxzpt"] Dec 16 15:24:30 crc kubenswrapper[4775]: I1216 15:24:30.079147 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-rc6p4"] Dec 16 15:24:30 crc kubenswrapper[4775]: I1216 15:24:30.093188 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5251-account-create-update-fmcxk"] Dec 16 15:24:30 crc kubenswrapper[4775]: I1216 15:24:30.103221 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6a2c-account-create-update-l9k8g"] Dec 16 15:24:30 crc kubenswrapper[4775]: I1216 15:24:30.113204 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-jfprw"] Dec 16 15:24:30 crc kubenswrapper[4775]: I1216 15:24:30.123019 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mxzpt"] Dec 16 15:24:30 crc kubenswrapper[4775]: I1216 15:24:30.132102 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-rc6p4"] Dec 16 15:24:30 crc kubenswrapper[4775]: I1216 15:24:30.140982 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5251-account-create-update-fmcxk"] Dec 16 15:24:30 crc kubenswrapper[4775]: I1216 15:24:30.151784 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-jfprw"] Dec 16 15:24:30 crc kubenswrapper[4775]: I1216 15:24:30.338551 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:24:30 crc kubenswrapper[4775]: E1216 15:24:30.339194 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:24:31 crc kubenswrapper[4775]: I1216 15:24:31.352072 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4078625a-a3c7-45f0-85e1-56c07c7b85b9" path="/var/lib/kubelet/pods/4078625a-a3c7-45f0-85e1-56c07c7b85b9/volumes" Dec 16 15:24:31 crc kubenswrapper[4775]: I1216 15:24:31.353013 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41974328-a6d3-4c9f-af51-edd0faa04b5a" path="/var/lib/kubelet/pods/41974328-a6d3-4c9f-af51-edd0faa04b5a/volumes" Dec 16 15:24:31 crc kubenswrapper[4775]: I1216 15:24:31.353856 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="735a5a7b-bafc-467b-9fa0-9a6755f9f04c" path="/var/lib/kubelet/pods/735a5a7b-bafc-467b-9fa0-9a6755f9f04c/volumes" Dec 16 15:24:31 crc kubenswrapper[4775]: I1216 15:24:31.354736 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afe8ec15-4c89-4cf1-a6b7-5e3534660d7b" path="/var/lib/kubelet/pods/afe8ec15-4c89-4cf1-a6b7-5e3534660d7b/volumes" Dec 16 15:24:31 crc kubenswrapper[4775]: I1216 15:24:31.356507 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc656f33-c012-450b-b263-81b4375a3d58" path="/var/lib/kubelet/pods/bc656f33-c012-450b-b263-81b4375a3d58/volumes" Dec 16 15:24:43 crc kubenswrapper[4775]: I1216 15:24:43.337629 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:24:43 crc kubenswrapper[4775]: E1216 15:24:43.338424 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" 
podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:24:50 crc kubenswrapper[4775]: I1216 15:24:50.592967 4775 scope.go:117] "RemoveContainer" containerID="f949ba9f03f7fdcf9c5179f6be2903cc3b7f7a73b24900acca12da063e1390f0" Dec 16 15:24:50 crc kubenswrapper[4775]: I1216 15:24:50.619416 4775 scope.go:117] "RemoveContainer" containerID="09e2649ac93ec514357db06eaa4b993541638b063d5c2d1c90e62105d892ccfe" Dec 16 15:24:50 crc kubenswrapper[4775]: I1216 15:24:50.670912 4775 scope.go:117] "RemoveContainer" containerID="c26c455a750f5de4ed93a47c05e72cb5218308a1eb988aeafcaaf08b9a362479" Dec 16 15:24:50 crc kubenswrapper[4775]: I1216 15:24:50.713220 4775 scope.go:117] "RemoveContainer" containerID="91014cd1d19cc2aa60b3956d63e5e9bf85784cf5d2b17449a249b98dfccb5c5f" Dec 16 15:24:50 crc kubenswrapper[4775]: I1216 15:24:50.757017 4775 scope.go:117] "RemoveContainer" containerID="b425b98f1568c51e1d08a64b8b7618011cd885d6be9f44f576b060412bc5caac" Dec 16 15:24:50 crc kubenswrapper[4775]: I1216 15:24:50.802342 4775 scope.go:117] "RemoveContainer" containerID="1cfc513e3134d62bc62418cd1ff86ffb049bdbd4b1c16aae98aaf777cf74bada" Dec 16 15:24:52 crc kubenswrapper[4775]: I1216 15:24:52.046501 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-tj4s6"] Dec 16 15:24:52 crc kubenswrapper[4775]: I1216 15:24:52.055722 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-tj4s6"] Dec 16 15:24:53 crc kubenswrapper[4775]: I1216 15:24:53.347576 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e1160f7-8d33-49ce-a91c-7c4e95335f49" path="/var/lib/kubelet/pods/8e1160f7-8d33-49ce-a91c-7c4e95335f49/volumes" Dec 16 15:24:54 crc kubenswrapper[4775]: I1216 15:24:54.034176 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-7skc8"] Dec 16 15:24:54 crc kubenswrapper[4775]: I1216 15:24:54.046828 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-db-create-7skc8"] Dec 16 15:24:54 crc kubenswrapper[4775]: I1216 15:24:54.062288 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-aacd-account-create-update-swcfz"] Dec 16 15:24:54 crc kubenswrapper[4775]: I1216 15:24:54.081059 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-sxkbc"] Dec 16 15:24:54 crc kubenswrapper[4775]: I1216 15:24:54.088844 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-sxkbc"] Dec 16 15:24:54 crc kubenswrapper[4775]: I1216 15:24:54.096629 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-aacd-account-create-update-swcfz"] Dec 16 15:24:54 crc kubenswrapper[4775]: I1216 15:24:54.129056 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-bb88-account-create-update-tdzmc"] Dec 16 15:24:54 crc kubenswrapper[4775]: I1216 15:24:54.143724 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-bb88-account-create-update-tdzmc"] Dec 16 15:24:54 crc kubenswrapper[4775]: I1216 15:24:54.158094 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-4908-account-create-update-plzkb"] Dec 16 15:24:54 crc kubenswrapper[4775]: I1216 15:24:54.165708 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7302-account-create-update-n2t2m"] Dec 16 15:24:54 crc kubenswrapper[4775]: I1216 15:24:54.174619 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-lx2gs"] Dec 16 15:24:54 crc kubenswrapper[4775]: I1216 15:24:54.182373 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-4908-account-create-update-plzkb"] Dec 16 15:24:54 crc kubenswrapper[4775]: I1216 15:24:54.191069 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-lx2gs"] Dec 16 15:24:54 crc kubenswrapper[4775]: I1216 15:24:54.200734 4775 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/neutron-7302-account-create-update-n2t2m"] Dec 16 15:24:55 crc kubenswrapper[4775]: I1216 15:24:55.358098 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a4cdadf-d811-4803-8dc0-3815d029a281" path="/var/lib/kubelet/pods/2a4cdadf-d811-4803-8dc0-3815d029a281/volumes" Dec 16 15:24:55 crc kubenswrapper[4775]: I1216 15:24:55.359396 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42d94310-d877-4e8d-b540-6526c1c26626" path="/var/lib/kubelet/pods/42d94310-d877-4e8d-b540-6526c1c26626/volumes" Dec 16 15:24:55 crc kubenswrapper[4775]: I1216 15:24:55.360469 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a46193a-2b19-4e7e-b0fe-090319010bba" path="/var/lib/kubelet/pods/6a46193a-2b19-4e7e-b0fe-090319010bba/volumes" Dec 16 15:24:55 crc kubenswrapper[4775]: I1216 15:24:55.361568 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f" path="/var/lib/kubelet/pods/7661fcd3-f70e-4e53-87c1-15ea8ef3fc2f/volumes" Dec 16 15:24:55 crc kubenswrapper[4775]: I1216 15:24:55.363818 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aa308ba-b5b7-4acb-9744-0254560e0a1f" path="/var/lib/kubelet/pods/7aa308ba-b5b7-4acb-9744-0254560e0a1f/volumes" Dec 16 15:24:55 crc kubenswrapper[4775]: I1216 15:24:55.365028 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c404df9-a26b-44f1-bd1a-be3f6877f896" path="/var/lib/kubelet/pods/8c404df9-a26b-44f1-bd1a-be3f6877f896/volumes" Dec 16 15:24:55 crc kubenswrapper[4775]: I1216 15:24:55.366213 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba3c5008-fead-4517-9aed-d02d07560a0a" path="/var/lib/kubelet/pods/ba3c5008-fead-4517-9aed-d02d07560a0a/volumes" Dec 16 15:24:56 crc kubenswrapper[4775]: I1216 15:24:56.338372 4775 scope.go:117] "RemoveContainer" 
containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:24:56 crc kubenswrapper[4775]: E1216 15:24:56.339280 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:24:59 crc kubenswrapper[4775]: I1216 15:24:59.029987 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-7lsw2"] Dec 16 15:24:59 crc kubenswrapper[4775]: I1216 15:24:59.038480 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-7lsw2"] Dec 16 15:24:59 crc kubenswrapper[4775]: I1216 15:24:59.347473 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8de491d-4c4f-44bc-82d5-7d571b4920e8" path="/var/lib/kubelet/pods/a8de491d-4c4f-44bc-82d5-7d571b4920e8/volumes" Dec 16 15:25:00 crc kubenswrapper[4775]: I1216 15:25:00.029489 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-q52vm"] Dec 16 15:25:00 crc kubenswrapper[4775]: I1216 15:25:00.039331 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-q52vm"] Dec 16 15:25:01 crc kubenswrapper[4775]: I1216 15:25:01.357632 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeb36c52-8730-43d7-a18e-6f1e3d8312ff" path="/var/lib/kubelet/pods/aeb36c52-8730-43d7-a18e-6f1e3d8312ff/volumes" Dec 16 15:25:08 crc kubenswrapper[4775]: I1216 15:25:08.338633 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:25:08 crc kubenswrapper[4775]: E1216 15:25:08.339370 4775 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:25:19 crc kubenswrapper[4775]: I1216 15:25:19.338290 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:25:19 crc kubenswrapper[4775]: E1216 15:25:19.339088 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:25:34 crc kubenswrapper[4775]: I1216 15:25:34.338206 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:25:34 crc kubenswrapper[4775]: E1216 15:25:34.339081 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:25:37 crc kubenswrapper[4775]: I1216 15:25:37.091529 4775 generic.go:334] "Generic (PLEG): container finished" podID="32966f09-4e16-4fcb-925e-edb1c957cea1" containerID="b25349b770122ce06b00c8a9be3324c781a94c1c9bafc65093af56ffb03c99fe" exitCode=0 Dec 16 15:25:37 crc 
kubenswrapper[4775]: I1216 15:25:37.091575 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" event={"ID":"32966f09-4e16-4fcb-925e-edb1c957cea1","Type":"ContainerDied","Data":"b25349b770122ce06b00c8a9be3324c781a94c1c9bafc65093af56ffb03c99fe"} Dec 16 15:25:38 crc kubenswrapper[4775]: I1216 15:25:38.039993 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-5dkn9"] Dec 16 15:25:38 crc kubenswrapper[4775]: I1216 15:25:38.049776 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-5dkn9"] Dec 16 15:25:38 crc kubenswrapper[4775]: I1216 15:25:38.491425 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" Dec 16 15:25:38 crc kubenswrapper[4775]: I1216 15:25:38.592539 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32966f09-4e16-4fcb-925e-edb1c957cea1-ssh-key\") pod \"32966f09-4e16-4fcb-925e-edb1c957cea1\" (UID: \"32966f09-4e16-4fcb-925e-edb1c957cea1\") " Dec 16 15:25:38 crc kubenswrapper[4775]: I1216 15:25:38.592710 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fkrj\" (UniqueName: \"kubernetes.io/projected/32966f09-4e16-4fcb-925e-edb1c957cea1-kube-api-access-5fkrj\") pod \"32966f09-4e16-4fcb-925e-edb1c957cea1\" (UID: \"32966f09-4e16-4fcb-925e-edb1c957cea1\") " Dec 16 15:25:38 crc kubenswrapper[4775]: I1216 15:25:38.592861 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32966f09-4e16-4fcb-925e-edb1c957cea1-inventory\") pod \"32966f09-4e16-4fcb-925e-edb1c957cea1\" (UID: \"32966f09-4e16-4fcb-925e-edb1c957cea1\") " Dec 16 15:25:38 crc kubenswrapper[4775]: I1216 15:25:38.601253 4775 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32966f09-4e16-4fcb-925e-edb1c957cea1-kube-api-access-5fkrj" (OuterVolumeSpecName: "kube-api-access-5fkrj") pod "32966f09-4e16-4fcb-925e-edb1c957cea1" (UID: "32966f09-4e16-4fcb-925e-edb1c957cea1"). InnerVolumeSpecName "kube-api-access-5fkrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:25:38 crc kubenswrapper[4775]: I1216 15:25:38.622791 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32966f09-4e16-4fcb-925e-edb1c957cea1-inventory" (OuterVolumeSpecName: "inventory") pod "32966f09-4e16-4fcb-925e-edb1c957cea1" (UID: "32966f09-4e16-4fcb-925e-edb1c957cea1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:25:38 crc kubenswrapper[4775]: I1216 15:25:38.636528 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32966f09-4e16-4fcb-925e-edb1c957cea1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "32966f09-4e16-4fcb-925e-edb1c957cea1" (UID: "32966f09-4e16-4fcb-925e-edb1c957cea1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:25:38 crc kubenswrapper[4775]: I1216 15:25:38.695503 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fkrj\" (UniqueName: \"kubernetes.io/projected/32966f09-4e16-4fcb-925e-edb1c957cea1-kube-api-access-5fkrj\") on node \"crc\" DevicePath \"\"" Dec 16 15:25:38 crc kubenswrapper[4775]: I1216 15:25:38.695543 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32966f09-4e16-4fcb-925e-edb1c957cea1-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:25:38 crc kubenswrapper[4775]: I1216 15:25:38.695556 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32966f09-4e16-4fcb-925e-edb1c957cea1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.109263 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" event={"ID":"32966f09-4e16-4fcb-925e-edb1c957cea1","Type":"ContainerDied","Data":"3c73ffb426eb1b60b99ede8ce14a72ad9d585a008c21db8c9c5bda1877643f89"} Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.109546 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c73ffb426eb1b60b99ede8ce14a72ad9d585a008c21db8c9c5bda1877643f89" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.109313 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.227992 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc"] Dec 16 15:25:39 crc kubenswrapper[4775]: E1216 15:25:39.228673 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32966f09-4e16-4fcb-925e-edb1c957cea1" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.228766 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="32966f09-4e16-4fcb-925e-edb1c957cea1" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.229098 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="32966f09-4e16-4fcb-925e-edb1c957cea1" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.229940 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.241204 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.242260 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc"] Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.242508 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tgv5f" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.242653 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.243018 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.320351 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14cad095-639f-4735-8e83-d5a2abd771c3-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76qtc\" (UID: \"14cad095-639f-4735-8e83-d5a2abd771c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.320443 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mncpf\" (UniqueName: \"kubernetes.io/projected/14cad095-639f-4735-8e83-d5a2abd771c3-kube-api-access-mncpf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76qtc\" (UID: \"14cad095-639f-4735-8e83-d5a2abd771c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" Dec 16 15:25:39 crc kubenswrapper[4775]: 
I1216 15:25:39.320505 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14cad095-639f-4735-8e83-d5a2abd771c3-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76qtc\" (UID: \"14cad095-639f-4735-8e83-d5a2abd771c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.347900 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63559e2e-2493-4b6f-b5d2-1573af2e9ac6" path="/var/lib/kubelet/pods/63559e2e-2493-4b6f-b5d2-1573af2e9ac6/volumes" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.421692 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14cad095-639f-4735-8e83-d5a2abd771c3-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76qtc\" (UID: \"14cad095-639f-4735-8e83-d5a2abd771c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.421805 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mncpf\" (UniqueName: \"kubernetes.io/projected/14cad095-639f-4735-8e83-d5a2abd771c3-kube-api-access-mncpf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76qtc\" (UID: \"14cad095-639f-4735-8e83-d5a2abd771c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.421854 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14cad095-639f-4735-8e83-d5a2abd771c3-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76qtc\" (UID: \"14cad095-639f-4735-8e83-d5a2abd771c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" 
Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.426135 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14cad095-639f-4735-8e83-d5a2abd771c3-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76qtc\" (UID: \"14cad095-639f-4735-8e83-d5a2abd771c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.428437 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14cad095-639f-4735-8e83-d5a2abd771c3-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76qtc\" (UID: \"14cad095-639f-4735-8e83-d5a2abd771c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.441447 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mncpf\" (UniqueName: \"kubernetes.io/projected/14cad095-639f-4735-8e83-d5a2abd771c3-kube-api-access-mncpf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-76qtc\" (UID: \"14cad095-639f-4735-8e83-d5a2abd771c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" Dec 16 15:25:39 crc kubenswrapper[4775]: I1216 15:25:39.583541 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" Dec 16 15:25:40 crc kubenswrapper[4775]: I1216 15:25:40.038126 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-cjjlj"] Dec 16 15:25:40 crc kubenswrapper[4775]: I1216 15:25:40.049946 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-cjjlj"] Dec 16 15:25:40 crc kubenswrapper[4775]: I1216 15:25:40.118977 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc"] Dec 16 15:25:41 crc kubenswrapper[4775]: I1216 15:25:41.033508 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vgjff"] Dec 16 15:25:41 crc kubenswrapper[4775]: I1216 15:25:41.044704 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vgjff"] Dec 16 15:25:41 crc kubenswrapper[4775]: I1216 15:25:41.133672 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" event={"ID":"14cad095-639f-4735-8e83-d5a2abd771c3","Type":"ContainerStarted","Data":"8738979404a4078c3b34ef2b1c096c2e3302cd1385e5ed9ef41b51349ded9877"} Dec 16 15:25:41 crc kubenswrapper[4775]: I1216 15:25:41.133735 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" event={"ID":"14cad095-639f-4735-8e83-d5a2abd771c3","Type":"ContainerStarted","Data":"5a1e7c9caad17402fa0ec2f54f3f95bf7532a4897a24d28bcb085e19c7de89b4"} Dec 16 15:25:41 crc kubenswrapper[4775]: I1216 15:25:41.149339 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" podStartSLOduration=1.5969037670000001 podStartE2EDuration="2.149323226s" podCreationTimestamp="2025-12-16 15:25:39 +0000 UTC" firstStartedPulling="2025-12-16 
15:25:40.126741915 +0000 UTC m=+1865.077820838" lastFinishedPulling="2025-12-16 15:25:40.679161374 +0000 UTC m=+1865.630240297" observedRunningTime="2025-12-16 15:25:41.149083178 +0000 UTC m=+1866.100162101" watchObservedRunningTime="2025-12-16 15:25:41.149323226 +0000 UTC m=+1866.100402149" Dec 16 15:25:41 crc kubenswrapper[4775]: I1216 15:25:41.347857 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d12e79-44d3-4b3a-bd17-af547a42fc19" path="/var/lib/kubelet/pods/10d12e79-44d3-4b3a-bd17-af547a42fc19/volumes" Dec 16 15:25:41 crc kubenswrapper[4775]: I1216 15:25:41.348775 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c437c729-4da8-4394-8863-d0e4c8e73de1" path="/var/lib/kubelet/pods/c437c729-4da8-4394-8863-d0e4c8e73de1/volumes" Dec 16 15:25:47 crc kubenswrapper[4775]: I1216 15:25:47.337857 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:25:47 crc kubenswrapper[4775]: E1216 15:25:47.338478 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:25:50 crc kubenswrapper[4775]: I1216 15:25:50.952823 4775 scope.go:117] "RemoveContainer" containerID="89f45c4e5c40a31bc7b6642a5e24cc71e2ec3ecd178924ef14458eb6a6f20b92" Dec 16 15:25:50 crc kubenswrapper[4775]: I1216 15:25:50.987555 4775 scope.go:117] "RemoveContainer" containerID="c3b9ea8723e61d765a76682c8a3a99f74027968fa26a8cf2943ecfd0066ff450" Dec 16 15:25:51 crc kubenswrapper[4775]: I1216 15:25:51.024017 4775 scope.go:117] "RemoveContainer" 
containerID="c255aae9b300a2ec635893d98d4d6d4fcabd33d0368b9842e2d0028ed1d8853d" Dec 16 15:25:51 crc kubenswrapper[4775]: I1216 15:25:51.069661 4775 scope.go:117] "RemoveContainer" containerID="472065880d8deff99c6a79d6f32037d302c0ed561ce5bf9802f180450b854c26" Dec 16 15:25:51 crc kubenswrapper[4775]: I1216 15:25:51.125024 4775 scope.go:117] "RemoveContainer" containerID="bf0331e99b5c0e9753ae27008d8cbe195a5971f9f2191725eae2b3aa84b9af32" Dec 16 15:25:51 crc kubenswrapper[4775]: I1216 15:25:51.156490 4775 scope.go:117] "RemoveContainer" containerID="e552ac1296830e5c14dc2da9c5fc2bb36bdc2766b7cf8e914ff749aacb7f0e1d" Dec 16 15:25:51 crc kubenswrapper[4775]: I1216 15:25:51.205165 4775 scope.go:117] "RemoveContainer" containerID="0bfc0a03cc9ec791094f058889bcec04dcf0c29ed44680afd7d2ee54622c2e0c" Dec 16 15:25:51 crc kubenswrapper[4775]: I1216 15:25:51.256106 4775 scope.go:117] "RemoveContainer" containerID="c55627749e8172aa389502866530035ef0c596123f9be4c95c6be21c3cc8a398" Dec 16 15:25:51 crc kubenswrapper[4775]: I1216 15:25:51.278078 4775 scope.go:117] "RemoveContainer" containerID="4e6a0d5c59ff5f9b62a48578ca6acdbd20efe99ea4ca344c216212989790000d" Dec 16 15:25:51 crc kubenswrapper[4775]: I1216 15:25:51.300487 4775 scope.go:117] "RemoveContainer" containerID="2e9743547ab763bfb0e9a21cdce7c236a60c2e8ffb9ccd6357d6a629480e4dad" Dec 16 15:25:51 crc kubenswrapper[4775]: I1216 15:25:51.323681 4775 scope.go:117] "RemoveContainer" containerID="0178115a38cede9ffec018b174720c1bb83b25f7ccd8307084b884ddf5703460" Dec 16 15:25:51 crc kubenswrapper[4775]: I1216 15:25:51.342140 4775 scope.go:117] "RemoveContainer" containerID="09b9c4b99da956b010933ec11e32239a5835d68ab66ab71457c14fb53743123d" Dec 16 15:25:51 crc kubenswrapper[4775]: I1216 15:25:51.367302 4775 scope.go:117] "RemoveContainer" containerID="ea3cdaaf6dcc755350d4442c8d87f386b372864221f0a0de2ffdad55737f6e7d" Dec 16 15:25:58 crc kubenswrapper[4775]: I1216 15:25:58.045531 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-db-sync-sw8rd"] Dec 16 15:25:58 crc kubenswrapper[4775]: I1216 15:25:58.053304 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-9j6xx"] Dec 16 15:25:58 crc kubenswrapper[4775]: I1216 15:25:58.062460 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-sw8rd"] Dec 16 15:25:58 crc kubenswrapper[4775]: I1216 15:25:58.069974 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-9j6xx"] Dec 16 15:25:58 crc kubenswrapper[4775]: I1216 15:25:58.337934 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:25:58 crc kubenswrapper[4775]: E1216 15:25:58.338534 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:25:59 crc kubenswrapper[4775]: I1216 15:25:59.031387 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-j4mx8"] Dec 16 15:25:59 crc kubenswrapper[4775]: I1216 15:25:59.039959 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-j4mx8"] Dec 16 15:25:59 crc kubenswrapper[4775]: I1216 15:25:59.349085 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23611da1-3f26-42c4-bd23-36e0b04bdc24" path="/var/lib/kubelet/pods/23611da1-3f26-42c4-bd23-36e0b04bdc24/volumes" Dec 16 15:25:59 crc kubenswrapper[4775]: I1216 15:25:59.349697 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aae2a99-cf8f-4bdc-a5a0-4d548dcde207" path="/var/lib/kubelet/pods/6aae2a99-cf8f-4bdc-a5a0-4d548dcde207/volumes" Dec 16 
15:25:59 crc kubenswrapper[4775]: I1216 15:25:59.350384 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de018e03-657b-4eec-8b94-2d305f9bdbcf" path="/var/lib/kubelet/pods/de018e03-657b-4eec-8b94-2d305f9bdbcf/volumes" Dec 16 15:26:12 crc kubenswrapper[4775]: I1216 15:26:12.337819 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:26:12 crc kubenswrapper[4775]: E1216 15:26:12.338680 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:26:25 crc kubenswrapper[4775]: I1216 15:26:25.346323 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:26:25 crc kubenswrapper[4775]: E1216 15:26:25.347058 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:26:37 crc kubenswrapper[4775]: I1216 15:26:37.043805 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-svtt9"] Dec 16 15:26:37 crc kubenswrapper[4775]: I1216 15:26:37.057996 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-svtt9"] Dec 16 15:26:37 crc kubenswrapper[4775]: I1216 15:26:37.351134 4775 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="27246978-4a11-4370-9635-71ef44e99b6c" path="/var/lib/kubelet/pods/27246978-4a11-4370-9635-71ef44e99b6c/volumes" Dec 16 15:26:40 crc kubenswrapper[4775]: I1216 15:26:40.338874 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:26:40 crc kubenswrapper[4775]: E1216 15:26:40.340183 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:26:41 crc kubenswrapper[4775]: I1216 15:26:41.031767 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-68dw8"] Dec 16 15:26:41 crc kubenswrapper[4775]: I1216 15:26:41.041455 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-52a3-account-create-update-zq6jv"] Dec 16 15:26:41 crc kubenswrapper[4775]: I1216 15:26:41.048217 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-68dw8"] Dec 16 15:26:41 crc kubenswrapper[4775]: I1216 15:26:41.054638 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-52a3-account-create-update-zq6jv"] Dec 16 15:26:41 crc kubenswrapper[4775]: I1216 15:26:41.350096 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="454beaa2-a30a-4b5f-bb64-95eafaa20360" path="/var/lib/kubelet/pods/454beaa2-a30a-4b5f-bb64-95eafaa20360/volumes" Dec 16 15:26:41 crc kubenswrapper[4775]: I1216 15:26:41.352634 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5c39639-d2db-4abd-a474-d141a0d0af35" 
path="/var/lib/kubelet/pods/b5c39639-d2db-4abd-a474-d141a0d0af35/volumes" Dec 16 15:26:42 crc kubenswrapper[4775]: I1216 15:26:42.034518 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-9435-account-create-update-td6k7"] Dec 16 15:26:42 crc kubenswrapper[4775]: I1216 15:26:42.043012 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-41a9-account-create-update-fnj2g"] Dec 16 15:26:42 crc kubenswrapper[4775]: I1216 15:26:42.050545 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-8vlhw"] Dec 16 15:26:42 crc kubenswrapper[4775]: I1216 15:26:42.057065 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-9435-account-create-update-td6k7"] Dec 16 15:26:42 crc kubenswrapper[4775]: I1216 15:26:42.063342 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-41a9-account-create-update-fnj2g"] Dec 16 15:26:42 crc kubenswrapper[4775]: I1216 15:26:42.070797 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-8vlhw"] Dec 16 15:26:43 crc kubenswrapper[4775]: I1216 15:26:43.357943 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="382550fb-c9fc-4100-a196-8ab11975d0ad" path="/var/lib/kubelet/pods/382550fb-c9fc-4100-a196-8ab11975d0ad/volumes" Dec 16 15:26:43 crc kubenswrapper[4775]: I1216 15:26:43.358797 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a87cdc28-aa31-4446-a1a6-e0904f9daa62" path="/var/lib/kubelet/pods/a87cdc28-aa31-4446-a1a6-e0904f9daa62/volumes" Dec 16 15:26:43 crc kubenswrapper[4775]: I1216 15:26:43.359368 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c18fb7c5-60d7-497d-8031-4d3c073104a6" path="/var/lib/kubelet/pods/c18fb7c5-60d7-497d-8031-4d3c073104a6/volumes" Dec 16 15:26:51 crc kubenswrapper[4775]: I1216 15:26:51.613414 4775 scope.go:117] "RemoveContainer" 
containerID="50def315ca15d81b803f1d97fa51972c7bc6a3f2a28d7e9266064f84971336e6" Dec 16 15:26:51 crc kubenswrapper[4775]: I1216 15:26:51.638779 4775 scope.go:117] "RemoveContainer" containerID="56addecc1e52f86ca9007267a1f53cb04b8bc81de1ab3438dd6b3916e668a6df" Dec 16 15:26:51 crc kubenswrapper[4775]: I1216 15:26:51.685217 4775 scope.go:117] "RemoveContainer" containerID="553f825b26017179918aa15e17aad6de8ef28c4963820bef5b53cc7beed7a785" Dec 16 15:26:51 crc kubenswrapper[4775]: I1216 15:26:51.755597 4775 scope.go:117] "RemoveContainer" containerID="82de921d2b8a698c8cdd1fc8123013c7b791c624d3bc666653375f4a26a7a447" Dec 16 15:26:51 crc kubenswrapper[4775]: I1216 15:26:51.780319 4775 scope.go:117] "RemoveContainer" containerID="d803eff0ab69221595095bfe69b917868b310e315085c0ba307c7dc6f7ad4058" Dec 16 15:26:51 crc kubenswrapper[4775]: I1216 15:26:51.824596 4775 scope.go:117] "RemoveContainer" containerID="0bb60f91a6b144e70dc4dc03c61d897e0b0e04862929f94b4ec663398674d364" Dec 16 15:26:51 crc kubenswrapper[4775]: I1216 15:26:51.875469 4775 scope.go:117] "RemoveContainer" containerID="1346e61d1ead4c8b5724ad39b2813ecbc5c8bc421af45fe7093e0975737aa786" Dec 16 15:26:51 crc kubenswrapper[4775]: I1216 15:26:51.895345 4775 scope.go:117] "RemoveContainer" containerID="e78dc26c846c00548fc65746fd5409d68567cc52037fcba64378d9ad4be2b989" Dec 16 15:26:51 crc kubenswrapper[4775]: I1216 15:26:51.913595 4775 scope.go:117] "RemoveContainer" containerID="d9a19fea1ed0cd5dce2d31249b71ac0e2c5481f696e72971d6b95d27b3d61533" Dec 16 15:26:53 crc kubenswrapper[4775]: I1216 15:26:53.343697 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:26:53 crc kubenswrapper[4775]: E1216 15:26:53.343917 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:26:53 crc kubenswrapper[4775]: I1216 15:26:53.800276 4775 generic.go:334] "Generic (PLEG): container finished" podID="14cad095-639f-4735-8e83-d5a2abd771c3" containerID="8738979404a4078c3b34ef2b1c096c2e3302cd1385e5ed9ef41b51349ded9877" exitCode=0 Dec 16 15:26:53 crc kubenswrapper[4775]: I1216 15:26:53.800368 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" event={"ID":"14cad095-639f-4735-8e83-d5a2abd771c3","Type":"ContainerDied","Data":"8738979404a4078c3b34ef2b1c096c2e3302cd1385e5ed9ef41b51349ded9877"} Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.191780 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.337211 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14cad095-639f-4735-8e83-d5a2abd771c3-inventory\") pod \"14cad095-639f-4735-8e83-d5a2abd771c3\" (UID: \"14cad095-639f-4735-8e83-d5a2abd771c3\") " Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.337380 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14cad095-639f-4735-8e83-d5a2abd771c3-ssh-key\") pod \"14cad095-639f-4735-8e83-d5a2abd771c3\" (UID: \"14cad095-639f-4735-8e83-d5a2abd771c3\") " Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.337429 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mncpf\" (UniqueName: \"kubernetes.io/projected/14cad095-639f-4735-8e83-d5a2abd771c3-kube-api-access-mncpf\") pod 
\"14cad095-639f-4735-8e83-d5a2abd771c3\" (UID: \"14cad095-639f-4735-8e83-d5a2abd771c3\") " Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.344924 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14cad095-639f-4735-8e83-d5a2abd771c3-kube-api-access-mncpf" (OuterVolumeSpecName: "kube-api-access-mncpf") pod "14cad095-639f-4735-8e83-d5a2abd771c3" (UID: "14cad095-639f-4735-8e83-d5a2abd771c3"). InnerVolumeSpecName "kube-api-access-mncpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.368212 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14cad095-639f-4735-8e83-d5a2abd771c3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "14cad095-639f-4735-8e83-d5a2abd771c3" (UID: "14cad095-639f-4735-8e83-d5a2abd771c3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.372710 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14cad095-639f-4735-8e83-d5a2abd771c3-inventory" (OuterVolumeSpecName: "inventory") pod "14cad095-639f-4735-8e83-d5a2abd771c3" (UID: "14cad095-639f-4735-8e83-d5a2abd771c3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.439572 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14cad095-639f-4735-8e83-d5a2abd771c3-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.439606 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14cad095-639f-4735-8e83-d5a2abd771c3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.439620 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mncpf\" (UniqueName: \"kubernetes.io/projected/14cad095-639f-4735-8e83-d5a2abd771c3-kube-api-access-mncpf\") on node \"crc\" DevicePath \"\"" Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.819221 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" event={"ID":"14cad095-639f-4735-8e83-d5a2abd771c3","Type":"ContainerDied","Data":"5a1e7c9caad17402fa0ec2f54f3f95bf7532a4897a24d28bcb085e19c7de89b4"} Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.819534 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a1e7c9caad17402fa0ec2f54f3f95bf7532a4897a24d28bcb085e19c7de89b4" Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.819291 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-76qtc" Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.907432 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4"] Dec 16 15:26:55 crc kubenswrapper[4775]: E1216 15:26:55.907809 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cad095-639f-4735-8e83-d5a2abd771c3" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.907826 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cad095-639f-4735-8e83-d5a2abd771c3" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.908044 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cad095-639f-4735-8e83-d5a2abd771c3" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.908631 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.911159 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.911409 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.911566 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.911715 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tgv5f" Dec 16 15:26:55 crc kubenswrapper[4775]: I1216 15:26:55.925229 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4"] Dec 16 15:26:56 crc kubenswrapper[4775]: I1216 15:26:56.053978 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55b74b45-197a-47f8-88cf-ce675418f3ca-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4\" (UID: \"55b74b45-197a-47f8-88cf-ce675418f3ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" Dec 16 15:26:56 crc kubenswrapper[4775]: I1216 15:26:56.054054 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68l5f\" (UniqueName: \"kubernetes.io/projected/55b74b45-197a-47f8-88cf-ce675418f3ca-kube-api-access-68l5f\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4\" (UID: \"55b74b45-197a-47f8-88cf-ce675418f3ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" Dec 16 15:26:56 crc kubenswrapper[4775]: I1216 
15:26:56.054081 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55b74b45-197a-47f8-88cf-ce675418f3ca-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4\" (UID: \"55b74b45-197a-47f8-88cf-ce675418f3ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" Dec 16 15:26:56 crc kubenswrapper[4775]: I1216 15:26:56.156121 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55b74b45-197a-47f8-88cf-ce675418f3ca-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4\" (UID: \"55b74b45-197a-47f8-88cf-ce675418f3ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" Dec 16 15:26:56 crc kubenswrapper[4775]: I1216 15:26:56.156181 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68l5f\" (UniqueName: \"kubernetes.io/projected/55b74b45-197a-47f8-88cf-ce675418f3ca-kube-api-access-68l5f\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4\" (UID: \"55b74b45-197a-47f8-88cf-ce675418f3ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" Dec 16 15:26:56 crc kubenswrapper[4775]: I1216 15:26:56.156252 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55b74b45-197a-47f8-88cf-ce675418f3ca-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4\" (UID: \"55b74b45-197a-47f8-88cf-ce675418f3ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" Dec 16 15:26:56 crc kubenswrapper[4775]: I1216 15:26:56.161414 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55b74b45-197a-47f8-88cf-ce675418f3ca-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4\" (UID: \"55b74b45-197a-47f8-88cf-ce675418f3ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" Dec 16 15:26:56 crc kubenswrapper[4775]: I1216 15:26:56.161565 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55b74b45-197a-47f8-88cf-ce675418f3ca-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4\" (UID: \"55b74b45-197a-47f8-88cf-ce675418f3ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" Dec 16 15:26:56 crc kubenswrapper[4775]: I1216 15:26:56.176784 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68l5f\" (UniqueName: \"kubernetes.io/projected/55b74b45-197a-47f8-88cf-ce675418f3ca-kube-api-access-68l5f\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4\" (UID: \"55b74b45-197a-47f8-88cf-ce675418f3ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" Dec 16 15:26:56 crc kubenswrapper[4775]: I1216 15:26:56.229533 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" Dec 16 15:26:56 crc kubenswrapper[4775]: I1216 15:26:56.750628 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4"] Dec 16 15:26:56 crc kubenswrapper[4775]: W1216 15:26:56.765693 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55b74b45_197a_47f8_88cf_ce675418f3ca.slice/crio-c90b40b0d9035e4bc807d23a6198e2bf839bff9efaa7f66abedf6d916d253867 WatchSource:0}: Error finding container c90b40b0d9035e4bc807d23a6198e2bf839bff9efaa7f66abedf6d916d253867: Status 404 returned error can't find the container with id c90b40b0d9035e4bc807d23a6198e2bf839bff9efaa7f66abedf6d916d253867 Dec 16 15:26:56 crc kubenswrapper[4775]: I1216 15:26:56.829721 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" event={"ID":"55b74b45-197a-47f8-88cf-ce675418f3ca","Type":"ContainerStarted","Data":"c90b40b0d9035e4bc807d23a6198e2bf839bff9efaa7f66abedf6d916d253867"} Dec 16 15:26:57 crc kubenswrapper[4775]: I1216 15:26:57.842251 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" event={"ID":"55b74b45-197a-47f8-88cf-ce675418f3ca","Type":"ContainerStarted","Data":"740efb82b0fee3d65bedc39fdb1ddd6e72fde9dd311bdeddb3aa091cf73dd869"} Dec 16 15:26:57 crc kubenswrapper[4775]: I1216 15:26:57.859735 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" podStartSLOduration=2.282124372 podStartE2EDuration="2.859699168s" podCreationTimestamp="2025-12-16 15:26:55 +0000 UTC" firstStartedPulling="2025-12-16 15:26:56.772761502 +0000 UTC m=+1941.723840425" lastFinishedPulling="2025-12-16 15:26:57.350336298 +0000 UTC 
m=+1942.301415221" observedRunningTime="2025-12-16 15:26:57.858971235 +0000 UTC m=+1942.810050188" watchObservedRunningTime="2025-12-16 15:26:57.859699168 +0000 UTC m=+1942.810778121" Dec 16 15:27:02 crc kubenswrapper[4775]: I1216 15:27:02.891922 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" event={"ID":"55b74b45-197a-47f8-88cf-ce675418f3ca","Type":"ContainerDied","Data":"740efb82b0fee3d65bedc39fdb1ddd6e72fde9dd311bdeddb3aa091cf73dd869"} Dec 16 15:27:02 crc kubenswrapper[4775]: I1216 15:27:02.891923 4775 generic.go:334] "Generic (PLEG): container finished" podID="55b74b45-197a-47f8-88cf-ce675418f3ca" containerID="740efb82b0fee3d65bedc39fdb1ddd6e72fde9dd311bdeddb3aa091cf73dd869" exitCode=0 Dec 16 15:27:04 crc kubenswrapper[4775]: I1216 15:27:04.298363 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" Dec 16 15:27:04 crc kubenswrapper[4775]: I1216 15:27:04.434603 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55b74b45-197a-47f8-88cf-ce675418f3ca-ssh-key\") pod \"55b74b45-197a-47f8-88cf-ce675418f3ca\" (UID: \"55b74b45-197a-47f8-88cf-ce675418f3ca\") " Dec 16 15:27:04 crc kubenswrapper[4775]: I1216 15:27:04.434656 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55b74b45-197a-47f8-88cf-ce675418f3ca-inventory\") pod \"55b74b45-197a-47f8-88cf-ce675418f3ca\" (UID: \"55b74b45-197a-47f8-88cf-ce675418f3ca\") " Dec 16 15:27:04 crc kubenswrapper[4775]: I1216 15:27:04.434824 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68l5f\" (UniqueName: \"kubernetes.io/projected/55b74b45-197a-47f8-88cf-ce675418f3ca-kube-api-access-68l5f\") pod \"55b74b45-197a-47f8-88cf-ce675418f3ca\" 
(UID: \"55b74b45-197a-47f8-88cf-ce675418f3ca\") " Dec 16 15:27:04 crc kubenswrapper[4775]: I1216 15:27:04.440224 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55b74b45-197a-47f8-88cf-ce675418f3ca-kube-api-access-68l5f" (OuterVolumeSpecName: "kube-api-access-68l5f") pod "55b74b45-197a-47f8-88cf-ce675418f3ca" (UID: "55b74b45-197a-47f8-88cf-ce675418f3ca"). InnerVolumeSpecName "kube-api-access-68l5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:27:04 crc kubenswrapper[4775]: I1216 15:27:04.459715 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55b74b45-197a-47f8-88cf-ce675418f3ca-inventory" (OuterVolumeSpecName: "inventory") pod "55b74b45-197a-47f8-88cf-ce675418f3ca" (UID: "55b74b45-197a-47f8-88cf-ce675418f3ca"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:27:04 crc kubenswrapper[4775]: I1216 15:27:04.468041 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55b74b45-197a-47f8-88cf-ce675418f3ca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "55b74b45-197a-47f8-88cf-ce675418f3ca" (UID: "55b74b45-197a-47f8-88cf-ce675418f3ca"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:27:04 crc kubenswrapper[4775]: I1216 15:27:04.537095 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55b74b45-197a-47f8-88cf-ce675418f3ca-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:27:04 crc kubenswrapper[4775]: I1216 15:27:04.537133 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55b74b45-197a-47f8-88cf-ce675418f3ca-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:27:04 crc kubenswrapper[4775]: I1216 15:27:04.537142 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68l5f\" (UniqueName: \"kubernetes.io/projected/55b74b45-197a-47f8-88cf-ce675418f3ca-kube-api-access-68l5f\") on node \"crc\" DevicePath \"\"" Dec 16 15:27:04 crc kubenswrapper[4775]: I1216 15:27:04.909656 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" event={"ID":"55b74b45-197a-47f8-88cf-ce675418f3ca","Type":"ContainerDied","Data":"c90b40b0d9035e4bc807d23a6198e2bf839bff9efaa7f66abedf6d916d253867"} Dec 16 15:27:04 crc kubenswrapper[4775]: I1216 15:27:04.909999 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c90b40b0d9035e4bc807d23a6198e2bf839bff9efaa7f66abedf6d916d253867" Dec 16 15:27:04 crc kubenswrapper[4775]: I1216 15:27:04.909713 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.030692 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6"] Dec 16 15:27:05 crc kubenswrapper[4775]: E1216 15:27:05.031141 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55b74b45-197a-47f8-88cf-ce675418f3ca" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.031162 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="55b74b45-197a-47f8-88cf-ce675418f3ca" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.031348 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="55b74b45-197a-47f8-88cf-ce675418f3ca" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.031994 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.035483 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tgv5f" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.035953 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.036224 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.036458 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.047253 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6"] Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.147419 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c99310e2-070c-4bed-b14d-850dfd069353-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv5v6\" (UID: \"c99310e2-070c-4bed-b14d-850dfd069353\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.147708 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c99310e2-070c-4bed-b14d-850dfd069353-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv5v6\" (UID: \"c99310e2-070c-4bed-b14d-850dfd069353\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.147834 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chg6z\" (UniqueName: \"kubernetes.io/projected/c99310e2-070c-4bed-b14d-850dfd069353-kube-api-access-chg6z\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv5v6\" (UID: \"c99310e2-070c-4bed-b14d-850dfd069353\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.249190 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c99310e2-070c-4bed-b14d-850dfd069353-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv5v6\" (UID: \"c99310e2-070c-4bed-b14d-850dfd069353\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.249281 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c99310e2-070c-4bed-b14d-850dfd069353-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv5v6\" (UID: \"c99310e2-070c-4bed-b14d-850dfd069353\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.249358 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chg6z\" (UniqueName: \"kubernetes.io/projected/c99310e2-070c-4bed-b14d-850dfd069353-kube-api-access-chg6z\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv5v6\" (UID: \"c99310e2-070c-4bed-b14d-850dfd069353\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.253761 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c99310e2-070c-4bed-b14d-850dfd069353-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv5v6\" (UID: 
\"c99310e2-070c-4bed-b14d-850dfd069353\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.263627 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c99310e2-070c-4bed-b14d-850dfd069353-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv5v6\" (UID: \"c99310e2-070c-4bed-b14d-850dfd069353\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.266405 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chg6z\" (UniqueName: \"kubernetes.io/projected/c99310e2-070c-4bed-b14d-850dfd069353-kube-api-access-chg6z\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv5v6\" (UID: \"c99310e2-070c-4bed-b14d-850dfd069353\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.354666 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.858989 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6"] Dec 16 15:27:05 crc kubenswrapper[4775]: I1216 15:27:05.918719 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" event={"ID":"c99310e2-070c-4bed-b14d-850dfd069353","Type":"ContainerStarted","Data":"bc7bc556062c4aff900e31fdbd221536bfd44d30f8348b68e4d03af4a31d6355"} Dec 16 15:27:06 crc kubenswrapper[4775]: I1216 15:27:06.928971 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" event={"ID":"c99310e2-070c-4bed-b14d-850dfd069353","Type":"ContainerStarted","Data":"4cf0de4bfb5d834b296c1b6b732413f2492f8ee1dfcf29ff63f37a5b0be985eb"} Dec 16 15:27:06 crc kubenswrapper[4775]: I1216 15:27:06.949772 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" podStartSLOduration=1.445690482 podStartE2EDuration="1.949747646s" podCreationTimestamp="2025-12-16 15:27:05 +0000 UTC" firstStartedPulling="2025-12-16 15:27:05.863869033 +0000 UTC m=+1950.814947956" lastFinishedPulling="2025-12-16 15:27:06.367926177 +0000 UTC m=+1951.319005120" observedRunningTime="2025-12-16 15:27:06.941832047 +0000 UTC m=+1951.892911000" watchObservedRunningTime="2025-12-16 15:27:06.949747646 +0000 UTC m=+1951.900826569" Dec 16 15:27:07 crc kubenswrapper[4775]: I1216 15:27:07.347585 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:27:07 crc kubenswrapper[4775]: E1216 15:27:07.348051 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:27:13 crc kubenswrapper[4775]: I1216 15:27:13.051143 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6sj6z"] Dec 16 15:27:13 crc kubenswrapper[4775]: I1216 15:27:13.059845 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6sj6z"] Dec 16 15:27:13 crc kubenswrapper[4775]: I1216 15:27:13.351862 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="862dcea5-6162-4150-84d2-69baeced1f01" path="/var/lib/kubelet/pods/862dcea5-6162-4150-84d2-69baeced1f01/volumes" Dec 16 15:27:22 crc kubenswrapper[4775]: I1216 15:27:22.337841 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:27:22 crc kubenswrapper[4775]: E1216 15:27:22.339865 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:27:35 crc kubenswrapper[4775]: I1216 15:27:35.345265 4775 scope.go:117] "RemoveContainer" containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:27:35 crc kubenswrapper[4775]: I1216 15:27:35.622212 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" 
event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"4a69bb344960c504a6cfe9e1c8feab0a47fc248099223e066857479364957b64"} Dec 16 15:27:37 crc kubenswrapper[4775]: I1216 15:27:37.060638 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-smpjc"] Dec 16 15:27:37 crc kubenswrapper[4775]: I1216 15:27:37.068018 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-smpjc"] Dec 16 15:27:37 crc kubenswrapper[4775]: I1216 15:27:37.350356 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46586ca7-367e-47d4-bd95-11037f7bb60f" path="/var/lib/kubelet/pods/46586ca7-367e-47d4-bd95-11037f7bb60f/volumes" Dec 16 15:27:39 crc kubenswrapper[4775]: I1216 15:27:39.029478 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jlnfn"] Dec 16 15:27:39 crc kubenswrapper[4775]: I1216 15:27:39.036795 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jlnfn"] Dec 16 15:27:39 crc kubenswrapper[4775]: I1216 15:27:39.351711 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b63d44e7-07c5-48d1-bd00-6e8be2b2c889" path="/var/lib/kubelet/pods/b63d44e7-07c5-48d1-bd00-6e8be2b2c889/volumes" Dec 16 15:27:45 crc kubenswrapper[4775]: I1216 15:27:45.836810 4775 generic.go:334] "Generic (PLEG): container finished" podID="c99310e2-070c-4bed-b14d-850dfd069353" containerID="4cf0de4bfb5d834b296c1b6b732413f2492f8ee1dfcf29ff63f37a5b0be985eb" exitCode=0 Dec 16 15:27:45 crc kubenswrapper[4775]: I1216 15:27:45.837000 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" event={"ID":"c99310e2-070c-4bed-b14d-850dfd069353","Type":"ContainerDied","Data":"4cf0de4bfb5d834b296c1b6b732413f2492f8ee1dfcf29ff63f37a5b0be985eb"} Dec 16 15:27:47 crc kubenswrapper[4775]: I1216 15:27:47.298782 4775 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" Dec 16 15:27:47 crc kubenswrapper[4775]: I1216 15:27:47.411074 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chg6z\" (UniqueName: \"kubernetes.io/projected/c99310e2-070c-4bed-b14d-850dfd069353-kube-api-access-chg6z\") pod \"c99310e2-070c-4bed-b14d-850dfd069353\" (UID: \"c99310e2-070c-4bed-b14d-850dfd069353\") " Dec 16 15:27:47 crc kubenswrapper[4775]: I1216 15:27:47.411223 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c99310e2-070c-4bed-b14d-850dfd069353-inventory\") pod \"c99310e2-070c-4bed-b14d-850dfd069353\" (UID: \"c99310e2-070c-4bed-b14d-850dfd069353\") " Dec 16 15:27:47 crc kubenswrapper[4775]: I1216 15:27:47.411416 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c99310e2-070c-4bed-b14d-850dfd069353-ssh-key\") pod \"c99310e2-070c-4bed-b14d-850dfd069353\" (UID: \"c99310e2-070c-4bed-b14d-850dfd069353\") " Dec 16 15:27:47 crc kubenswrapper[4775]: I1216 15:27:47.417092 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c99310e2-070c-4bed-b14d-850dfd069353-kube-api-access-chg6z" (OuterVolumeSpecName: "kube-api-access-chg6z") pod "c99310e2-070c-4bed-b14d-850dfd069353" (UID: "c99310e2-070c-4bed-b14d-850dfd069353"). InnerVolumeSpecName "kube-api-access-chg6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:27:47 crc kubenswrapper[4775]: I1216 15:27:47.439944 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c99310e2-070c-4bed-b14d-850dfd069353-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c99310e2-070c-4bed-b14d-850dfd069353" (UID: "c99310e2-070c-4bed-b14d-850dfd069353"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:27:47 crc kubenswrapper[4775]: I1216 15:27:47.441306 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c99310e2-070c-4bed-b14d-850dfd069353-inventory" (OuterVolumeSpecName: "inventory") pod "c99310e2-070c-4bed-b14d-850dfd069353" (UID: "c99310e2-070c-4bed-b14d-850dfd069353"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:27:47 crc kubenswrapper[4775]: I1216 15:27:47.514423 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c99310e2-070c-4bed-b14d-850dfd069353-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:27:47 crc kubenswrapper[4775]: I1216 15:27:47.514702 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chg6z\" (UniqueName: \"kubernetes.io/projected/c99310e2-070c-4bed-b14d-850dfd069353-kube-api-access-chg6z\") on node \"crc\" DevicePath \"\"" Dec 16 15:27:47 crc kubenswrapper[4775]: I1216 15:27:47.514787 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c99310e2-070c-4bed-b14d-850dfd069353-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:27:47 crc kubenswrapper[4775]: I1216 15:27:47.855856 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" event={"ID":"c99310e2-070c-4bed-b14d-850dfd069353","Type":"ContainerDied","Data":"bc7bc556062c4aff900e31fdbd221536bfd44d30f8348b68e4d03af4a31d6355"} Dec 16 15:27:47 crc kubenswrapper[4775]: I1216 15:27:47.855919 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc7bc556062c4aff900e31fdbd221536bfd44d30f8348b68e4d03af4a31d6355" Dec 16 15:27:47 crc kubenswrapper[4775]: I1216 15:27:47.856227 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv5v6" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.086484 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7"] Dec 16 15:27:48 crc kubenswrapper[4775]: E1216 15:27:48.086958 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99310e2-070c-4bed-b14d-850dfd069353" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.086979 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99310e2-070c-4bed-b14d-850dfd069353" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.087217 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c99310e2-070c-4bed-b14d-850dfd069353" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.088067 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.091108 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.091372 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tgv5f" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.091561 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.095548 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7"] Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.108268 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.133711 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/410c8945-6eac-4dd6-943b-a2024de59d58-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7\" (UID: \"410c8945-6eac-4dd6-943b-a2024de59d58\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.133793 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/410c8945-6eac-4dd6-943b-a2024de59d58-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7\" (UID: \"410c8945-6eac-4dd6-943b-a2024de59d58\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.134023 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjrnj\" (UniqueName: \"kubernetes.io/projected/410c8945-6eac-4dd6-943b-a2024de59d58-kube-api-access-xjrnj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7\" (UID: \"410c8945-6eac-4dd6-943b-a2024de59d58\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.235419 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/410c8945-6eac-4dd6-943b-a2024de59d58-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7\" (UID: \"410c8945-6eac-4dd6-943b-a2024de59d58\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.235504 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/410c8945-6eac-4dd6-943b-a2024de59d58-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7\" (UID: \"410c8945-6eac-4dd6-943b-a2024de59d58\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.235600 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjrnj\" (UniqueName: \"kubernetes.io/projected/410c8945-6eac-4dd6-943b-a2024de59d58-kube-api-access-xjrnj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7\" (UID: \"410c8945-6eac-4dd6-943b-a2024de59d58\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.245048 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/410c8945-6eac-4dd6-943b-a2024de59d58-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7\" (UID: 
\"410c8945-6eac-4dd6-943b-a2024de59d58\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.245192 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/410c8945-6eac-4dd6-943b-a2024de59d58-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7\" (UID: \"410c8945-6eac-4dd6-943b-a2024de59d58\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.258871 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjrnj\" (UniqueName: \"kubernetes.io/projected/410c8945-6eac-4dd6-943b-a2024de59d58-kube-api-access-xjrnj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7\" (UID: \"410c8945-6eac-4dd6-943b-a2024de59d58\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.414072 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" Dec 16 15:27:48 crc kubenswrapper[4775]: I1216 15:27:48.942841 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7"] Dec 16 15:27:48 crc kubenswrapper[4775]: W1216 15:27:48.949510 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod410c8945_6eac_4dd6_943b_a2024de59d58.slice/crio-384ad78dd46a56050a810a165a91ef2da3fc5767ebc4f16330e5bd15c71d1454 WatchSource:0}: Error finding container 384ad78dd46a56050a810a165a91ef2da3fc5767ebc4f16330e5bd15c71d1454: Status 404 returned error can't find the container with id 384ad78dd46a56050a810a165a91ef2da3fc5767ebc4f16330e5bd15c71d1454 Dec 16 15:27:49 crc kubenswrapper[4775]: I1216 15:27:49.880156 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" event={"ID":"410c8945-6eac-4dd6-943b-a2024de59d58","Type":"ContainerStarted","Data":"b5c26c3fc3eccc72f041386252008fdefac1c15d74da4d62ac62c2aad27aae72"} Dec 16 15:27:49 crc kubenswrapper[4775]: I1216 15:27:49.880485 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" event={"ID":"410c8945-6eac-4dd6-943b-a2024de59d58","Type":"ContainerStarted","Data":"384ad78dd46a56050a810a165a91ef2da3fc5767ebc4f16330e5bd15c71d1454"} Dec 16 15:27:49 crc kubenswrapper[4775]: I1216 15:27:49.901548 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" podStartSLOduration=1.304560747 podStartE2EDuration="1.901527793s" podCreationTimestamp="2025-12-16 15:27:48 +0000 UTC" firstStartedPulling="2025-12-16 15:27:48.95239551 +0000 UTC m=+1993.903474433" lastFinishedPulling="2025-12-16 15:27:49.549362556 +0000 UTC m=+1994.500441479" 
observedRunningTime="2025-12-16 15:27:49.89765354 +0000 UTC m=+1994.848732493" watchObservedRunningTime="2025-12-16 15:27:49.901527793 +0000 UTC m=+1994.852606716" Dec 16 15:27:52 crc kubenswrapper[4775]: I1216 15:27:52.091572 4775 scope.go:117] "RemoveContainer" containerID="e2dbca579561629389f9c4485b9f0fd64e8db5ddf0326341697d07388e8e4994" Dec 16 15:27:52 crc kubenswrapper[4775]: I1216 15:27:52.149191 4775 scope.go:117] "RemoveContainer" containerID="1e50f2d806020593672b0862872199c85bf7b84ee4c99cc922d90b773a8fe8a6" Dec 16 15:27:52 crc kubenswrapper[4775]: I1216 15:27:52.203378 4775 scope.go:117] "RemoveContainer" containerID="3635d72ed63ee3edda4b94047de6593c62df3a0c20e58402bbce0c2173380187" Dec 16 15:28:22 crc kubenswrapper[4775]: I1216 15:28:22.049146 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-bdkkf"] Dec 16 15:28:22 crc kubenswrapper[4775]: I1216 15:28:22.062286 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-bdkkf"] Dec 16 15:28:23 crc kubenswrapper[4775]: I1216 15:28:23.348468 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f16cf76e-4507-4b0a-aefd-c32b2b0763f1" path="/var/lib/kubelet/pods/f16cf76e-4507-4b0a-aefd-c32b2b0763f1/volumes" Dec 16 15:28:34 crc kubenswrapper[4775]: I1216 15:28:34.101770 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4kblw"] Dec 16 15:28:34 crc kubenswrapper[4775]: I1216 15:28:34.105109 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4kblw" Dec 16 15:28:34 crc kubenswrapper[4775]: I1216 15:28:34.115666 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4kblw"] Dec 16 15:28:34 crc kubenswrapper[4775]: I1216 15:28:34.219291 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n7dn\" (UniqueName: \"kubernetes.io/projected/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-kube-api-access-9n7dn\") pod \"redhat-operators-4kblw\" (UID: \"4b8ef266-59b2-42c8-a776-ac22d1cdf15c\") " pod="openshift-marketplace/redhat-operators-4kblw" Dec 16 15:28:34 crc kubenswrapper[4775]: I1216 15:28:34.219626 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-utilities\") pod \"redhat-operators-4kblw\" (UID: \"4b8ef266-59b2-42c8-a776-ac22d1cdf15c\") " pod="openshift-marketplace/redhat-operators-4kblw" Dec 16 15:28:34 crc kubenswrapper[4775]: I1216 15:28:34.219780 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-catalog-content\") pod \"redhat-operators-4kblw\" (UID: \"4b8ef266-59b2-42c8-a776-ac22d1cdf15c\") " pod="openshift-marketplace/redhat-operators-4kblw" Dec 16 15:28:34 crc kubenswrapper[4775]: I1216 15:28:34.321275 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n7dn\" (UniqueName: \"kubernetes.io/projected/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-kube-api-access-9n7dn\") pod \"redhat-operators-4kblw\" (UID: \"4b8ef266-59b2-42c8-a776-ac22d1cdf15c\") " pod="openshift-marketplace/redhat-operators-4kblw" Dec 16 15:28:34 crc kubenswrapper[4775]: I1216 15:28:34.321640 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-utilities\") pod \"redhat-operators-4kblw\" (UID: \"4b8ef266-59b2-42c8-a776-ac22d1cdf15c\") " pod="openshift-marketplace/redhat-operators-4kblw" Dec 16 15:28:34 crc kubenswrapper[4775]: I1216 15:28:34.321825 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-catalog-content\") pod \"redhat-operators-4kblw\" (UID: \"4b8ef266-59b2-42c8-a776-ac22d1cdf15c\") " pod="openshift-marketplace/redhat-operators-4kblw" Dec 16 15:28:34 crc kubenswrapper[4775]: I1216 15:28:34.322269 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-utilities\") pod \"redhat-operators-4kblw\" (UID: \"4b8ef266-59b2-42c8-a776-ac22d1cdf15c\") " pod="openshift-marketplace/redhat-operators-4kblw" Dec 16 15:28:34 crc kubenswrapper[4775]: I1216 15:28:34.322623 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-catalog-content\") pod \"redhat-operators-4kblw\" (UID: \"4b8ef266-59b2-42c8-a776-ac22d1cdf15c\") " pod="openshift-marketplace/redhat-operators-4kblw" Dec 16 15:28:34 crc kubenswrapper[4775]: I1216 15:28:34.347132 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n7dn\" (UniqueName: \"kubernetes.io/projected/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-kube-api-access-9n7dn\") pod \"redhat-operators-4kblw\" (UID: \"4b8ef266-59b2-42c8-a776-ac22d1cdf15c\") " pod="openshift-marketplace/redhat-operators-4kblw" Dec 16 15:28:34 crc kubenswrapper[4775]: I1216 15:28:34.443470 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4kblw" Dec 16 15:28:34 crc kubenswrapper[4775]: I1216 15:28:34.942141 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4kblw"] Dec 16 15:28:35 crc kubenswrapper[4775]: I1216 15:28:35.479314 4775 generic.go:334] "Generic (PLEG): container finished" podID="4b8ef266-59b2-42c8-a776-ac22d1cdf15c" containerID="b9b4d3744221d0015ffafe272fc0c8800137d53539bab46ce5854cefd7ef33ab" exitCode=0 Dec 16 15:28:35 crc kubenswrapper[4775]: I1216 15:28:35.479374 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kblw" event={"ID":"4b8ef266-59b2-42c8-a776-ac22d1cdf15c","Type":"ContainerDied","Data":"b9b4d3744221d0015ffafe272fc0c8800137d53539bab46ce5854cefd7ef33ab"} Dec 16 15:28:35 crc kubenswrapper[4775]: I1216 15:28:35.479409 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kblw" event={"ID":"4b8ef266-59b2-42c8-a776-ac22d1cdf15c","Type":"ContainerStarted","Data":"275201cadf658e53c27055755096ba6069253347f23ce6e1e0848361da2bdd83"} Dec 16 15:28:36 crc kubenswrapper[4775]: I1216 15:28:36.492503 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kblw" event={"ID":"4b8ef266-59b2-42c8-a776-ac22d1cdf15c","Type":"ContainerStarted","Data":"ead2086a94ddde3ee601252da2545af98a0c0fc335d6ce5f4d85c62bad387ee7"} Dec 16 15:28:39 crc kubenswrapper[4775]: I1216 15:28:39.525824 4775 generic.go:334] "Generic (PLEG): container finished" podID="4b8ef266-59b2-42c8-a776-ac22d1cdf15c" containerID="ead2086a94ddde3ee601252da2545af98a0c0fc335d6ce5f4d85c62bad387ee7" exitCode=0 Dec 16 15:28:39 crc kubenswrapper[4775]: I1216 15:28:39.525927 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kblw" 
event={"ID":"4b8ef266-59b2-42c8-a776-ac22d1cdf15c","Type":"ContainerDied","Data":"ead2086a94ddde3ee601252da2545af98a0c0fc335d6ce5f4d85c62bad387ee7"} Dec 16 15:28:41 crc kubenswrapper[4775]: I1216 15:28:41.545878 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kblw" event={"ID":"4b8ef266-59b2-42c8-a776-ac22d1cdf15c","Type":"ContainerStarted","Data":"003d435fbdd7b3f91f6e95f966e30f25ca994840c8ed8361728b07bd532b2a3d"} Dec 16 15:28:41 crc kubenswrapper[4775]: I1216 15:28:41.547977 4775 generic.go:334] "Generic (PLEG): container finished" podID="410c8945-6eac-4dd6-943b-a2024de59d58" containerID="b5c26c3fc3eccc72f041386252008fdefac1c15d74da4d62ac62c2aad27aae72" exitCode=0 Dec 16 15:28:41 crc kubenswrapper[4775]: I1216 15:28:41.548024 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" event={"ID":"410c8945-6eac-4dd6-943b-a2024de59d58","Type":"ContainerDied","Data":"b5c26c3fc3eccc72f041386252008fdefac1c15d74da4d62ac62c2aad27aae72"} Dec 16 15:28:41 crc kubenswrapper[4775]: I1216 15:28:41.564780 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4kblw" podStartSLOduration=2.659320349 podStartE2EDuration="7.564760389s" podCreationTimestamp="2025-12-16 15:28:34 +0000 UTC" firstStartedPulling="2025-12-16 15:28:35.481244295 +0000 UTC m=+2040.432323218" lastFinishedPulling="2025-12-16 15:28:40.386684335 +0000 UTC m=+2045.337763258" observedRunningTime="2025-12-16 15:28:41.560845205 +0000 UTC m=+2046.511924158" watchObservedRunningTime="2025-12-16 15:28:41.564760389 +0000 UTC m=+2046.515839322" Dec 16 15:28:42 crc kubenswrapper[4775]: I1216 15:28:42.965984 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.106400 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/410c8945-6eac-4dd6-943b-a2024de59d58-inventory\") pod \"410c8945-6eac-4dd6-943b-a2024de59d58\" (UID: \"410c8945-6eac-4dd6-943b-a2024de59d58\") " Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.106561 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjrnj\" (UniqueName: \"kubernetes.io/projected/410c8945-6eac-4dd6-943b-a2024de59d58-kube-api-access-xjrnj\") pod \"410c8945-6eac-4dd6-943b-a2024de59d58\" (UID: \"410c8945-6eac-4dd6-943b-a2024de59d58\") " Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.106661 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/410c8945-6eac-4dd6-943b-a2024de59d58-ssh-key\") pod \"410c8945-6eac-4dd6-943b-a2024de59d58\" (UID: \"410c8945-6eac-4dd6-943b-a2024de59d58\") " Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.114835 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/410c8945-6eac-4dd6-943b-a2024de59d58-kube-api-access-xjrnj" (OuterVolumeSpecName: "kube-api-access-xjrnj") pod "410c8945-6eac-4dd6-943b-a2024de59d58" (UID: "410c8945-6eac-4dd6-943b-a2024de59d58"). InnerVolumeSpecName "kube-api-access-xjrnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.134718 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410c8945-6eac-4dd6-943b-a2024de59d58-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "410c8945-6eac-4dd6-943b-a2024de59d58" (UID: "410c8945-6eac-4dd6-943b-a2024de59d58"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.146963 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410c8945-6eac-4dd6-943b-a2024de59d58-inventory" (OuterVolumeSpecName: "inventory") pod "410c8945-6eac-4dd6-943b-a2024de59d58" (UID: "410c8945-6eac-4dd6-943b-a2024de59d58"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.208285 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/410c8945-6eac-4dd6-943b-a2024de59d58-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.208328 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/410c8945-6eac-4dd6-943b-a2024de59d58-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.208339 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjrnj\" (UniqueName: \"kubernetes.io/projected/410c8945-6eac-4dd6-943b-a2024de59d58-kube-api-access-xjrnj\") on node \"crc\" DevicePath \"\"" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.565256 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" event={"ID":"410c8945-6eac-4dd6-943b-a2024de59d58","Type":"ContainerDied","Data":"384ad78dd46a56050a810a165a91ef2da3fc5767ebc4f16330e5bd15c71d1454"} Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.565294 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="384ad78dd46a56050a810a165a91ef2da3fc5767ebc4f16330e5bd15c71d1454" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.565328 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.663954 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jf2hn"] Dec 16 15:28:43 crc kubenswrapper[4775]: E1216 15:28:43.664374 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410c8945-6eac-4dd6-943b-a2024de59d58" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.664399 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="410c8945-6eac-4dd6-943b-a2024de59d58" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.664651 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="410c8945-6eac-4dd6-943b-a2024de59d58" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.665451 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.669198 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tgv5f" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.669433 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.669592 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.669706 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.674914 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jf2hn"] Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.717352 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2lnt\" (UniqueName: \"kubernetes.io/projected/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-kube-api-access-r2lnt\") pod \"ssh-known-hosts-edpm-deployment-jf2hn\" (UID: \"67d72872-cd76-413f-bcbe-e0c6da3a8f5a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.717451 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jf2hn\" (UID: \"67d72872-cd76-413f-bcbe-e0c6da3a8f5a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.717518 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jf2hn\" (UID: \"67d72872-cd76-413f-bcbe-e0c6da3a8f5a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.820107 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jf2hn\" (UID: \"67d72872-cd76-413f-bcbe-e0c6da3a8f5a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.820489 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2lnt\" (UniqueName: \"kubernetes.io/projected/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-kube-api-access-r2lnt\") pod \"ssh-known-hosts-edpm-deployment-jf2hn\" (UID: \"67d72872-cd76-413f-bcbe-e0c6da3a8f5a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.820678 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jf2hn\" (UID: \"67d72872-cd76-413f-bcbe-e0c6da3a8f5a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.824936 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jf2hn\" (UID: \"67d72872-cd76-413f-bcbe-e0c6da3a8f5a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" Dec 
16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.825918 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jf2hn\" (UID: \"67d72872-cd76-413f-bcbe-e0c6da3a8f5a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.873843 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2lnt\" (UniqueName: \"kubernetes.io/projected/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-kube-api-access-r2lnt\") pod \"ssh-known-hosts-edpm-deployment-jf2hn\" (UID: \"67d72872-cd76-413f-bcbe-e0c6da3a8f5a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" Dec 16 15:28:43 crc kubenswrapper[4775]: I1216 15:28:43.983741 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" Dec 16 15:28:44 crc kubenswrapper[4775]: I1216 15:28:44.444538 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4kblw" Dec 16 15:28:44 crc kubenswrapper[4775]: I1216 15:28:44.444785 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4kblw" Dec 16 15:28:44 crc kubenswrapper[4775]: I1216 15:28:44.500342 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jf2hn"] Dec 16 15:28:44 crc kubenswrapper[4775]: W1216 15:28:44.501746 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67d72872_cd76_413f_bcbe_e0c6da3a8f5a.slice/crio-c21f64059ae7d68dd8430dbe3680a2f493b724d826d62bec2bdf62efd31aab71 WatchSource:0}: Error finding container c21f64059ae7d68dd8430dbe3680a2f493b724d826d62bec2bdf62efd31aab71: Status 404 returned error can't find the 
container with id c21f64059ae7d68dd8430dbe3680a2f493b724d826d62bec2bdf62efd31aab71 Dec 16 15:28:44 crc kubenswrapper[4775]: I1216 15:28:44.575043 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" event={"ID":"67d72872-cd76-413f-bcbe-e0c6da3a8f5a","Type":"ContainerStarted","Data":"c21f64059ae7d68dd8430dbe3680a2f493b724d826d62bec2bdf62efd31aab71"} Dec 16 15:28:45 crc kubenswrapper[4775]: I1216 15:28:45.490793 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4kblw" podUID="4b8ef266-59b2-42c8-a776-ac22d1cdf15c" containerName="registry-server" probeResult="failure" output=< Dec 16 15:28:45 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Dec 16 15:28:45 crc kubenswrapper[4775]: > Dec 16 15:28:45 crc kubenswrapper[4775]: I1216 15:28:45.603067 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" event={"ID":"67d72872-cd76-413f-bcbe-e0c6da3a8f5a","Type":"ContainerStarted","Data":"f5760e33d8f2228ff073e02767137288ed6b18ace7de0e05d8a1b2fd2b7b1b38"} Dec 16 15:28:45 crc kubenswrapper[4775]: I1216 15:28:45.620335 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" podStartSLOduration=2.166292444 podStartE2EDuration="2.620314848s" podCreationTimestamp="2025-12-16 15:28:43 +0000 UTC" firstStartedPulling="2025-12-16 15:28:44.504042096 +0000 UTC m=+2049.455121039" lastFinishedPulling="2025-12-16 15:28:44.95806452 +0000 UTC m=+2049.909143443" observedRunningTime="2025-12-16 15:28:45.619093759 +0000 UTC m=+2050.570172692" watchObservedRunningTime="2025-12-16 15:28:45.620314848 +0000 UTC m=+2050.571393771" Dec 16 15:28:52 crc kubenswrapper[4775]: I1216 15:28:52.304314 4775 scope.go:117] "RemoveContainer" containerID="63525ee91b85db663a55315ccf39419df3bc609346f6df513788091ea6679d1d" Dec 16 15:28:52 crc 
kubenswrapper[4775]: I1216 15:28:52.667779 4775 generic.go:334] "Generic (PLEG): container finished" podID="67d72872-cd76-413f-bcbe-e0c6da3a8f5a" containerID="f5760e33d8f2228ff073e02767137288ed6b18ace7de0e05d8a1b2fd2b7b1b38" exitCode=0 Dec 16 15:28:52 crc kubenswrapper[4775]: I1216 15:28:52.668107 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" event={"ID":"67d72872-cd76-413f-bcbe-e0c6da3a8f5a","Type":"ContainerDied","Data":"f5760e33d8f2228ff073e02767137288ed6b18ace7de0e05d8a1b2fd2b7b1b38"} Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.102969 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.224973 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-ssh-key-openstack-edpm-ipam\") pod \"67d72872-cd76-413f-bcbe-e0c6da3a8f5a\" (UID: \"67d72872-cd76-413f-bcbe-e0c6da3a8f5a\") " Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.225033 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2lnt\" (UniqueName: \"kubernetes.io/projected/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-kube-api-access-r2lnt\") pod \"67d72872-cd76-413f-bcbe-e0c6da3a8f5a\" (UID: \"67d72872-cd76-413f-bcbe-e0c6da3a8f5a\") " Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.225210 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-inventory-0\") pod \"67d72872-cd76-413f-bcbe-e0c6da3a8f5a\" (UID: \"67d72872-cd76-413f-bcbe-e0c6da3a8f5a\") " Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.230710 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-kube-api-access-r2lnt" (OuterVolumeSpecName: "kube-api-access-r2lnt") pod "67d72872-cd76-413f-bcbe-e0c6da3a8f5a" (UID: "67d72872-cd76-413f-bcbe-e0c6da3a8f5a"). InnerVolumeSpecName "kube-api-access-r2lnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.251194 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "67d72872-cd76-413f-bcbe-e0c6da3a8f5a" (UID: "67d72872-cd76-413f-bcbe-e0c6da3a8f5a"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.252321 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "67d72872-cd76-413f-bcbe-e0c6da3a8f5a" (UID: "67d72872-cd76-413f-bcbe-e0c6da3a8f5a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.328527 4775 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.328555 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.328565 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2lnt\" (UniqueName: \"kubernetes.io/projected/67d72872-cd76-413f-bcbe-e0c6da3a8f5a-kube-api-access-r2lnt\") on node \"crc\" DevicePath \"\"" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.493143 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4kblw" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.539355 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4kblw" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.689451 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.689496 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jf2hn" event={"ID":"67d72872-cd76-413f-bcbe-e0c6da3a8f5a","Type":"ContainerDied","Data":"c21f64059ae7d68dd8430dbe3680a2f493b724d826d62bec2bdf62efd31aab71"} Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.689538 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c21f64059ae7d68dd8430dbe3680a2f493b724d826d62bec2bdf62efd31aab71" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.736318 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4kblw"] Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.770368 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k"] Dec 16 15:28:54 crc kubenswrapper[4775]: E1216 15:28:54.770842 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d72872-cd76-413f-bcbe-e0c6da3a8f5a" containerName="ssh-known-hosts-edpm-deployment" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.770866 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d72872-cd76-413f-bcbe-e0c6da3a8f5a" containerName="ssh-known-hosts-edpm-deployment" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.771119 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d72872-cd76-413f-bcbe-e0c6da3a8f5a" containerName="ssh-known-hosts-edpm-deployment" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.771805 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.773505 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.773815 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.773822 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.774048 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tgv5f" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.793186 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k"] Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.838248 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fzwb\" (UniqueName: \"kubernetes.io/projected/e0ba352c-17c3-4c36-b409-83485c265668-kube-api-access-7fzwb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tnb9k\" (UID: \"e0ba352c-17c3-4c36-b409-83485c265668\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.838367 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0ba352c-17c3-4c36-b409-83485c265668-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tnb9k\" (UID: \"e0ba352c-17c3-4c36-b409-83485c265668\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.838416 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0ba352c-17c3-4c36-b409-83485c265668-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tnb9k\" (UID: \"e0ba352c-17c3-4c36-b409-83485c265668\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.940525 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0ba352c-17c3-4c36-b409-83485c265668-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tnb9k\" (UID: \"e0ba352c-17c3-4c36-b409-83485c265668\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.941527 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0ba352c-17c3-4c36-b409-83485c265668-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tnb9k\" (UID: \"e0ba352c-17c3-4c36-b409-83485c265668\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.941682 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fzwb\" (UniqueName: \"kubernetes.io/projected/e0ba352c-17c3-4c36-b409-83485c265668-kube-api-access-7fzwb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tnb9k\" (UID: \"e0ba352c-17c3-4c36-b409-83485c265668\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.944962 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0ba352c-17c3-4c36-b409-83485c265668-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tnb9k\" (UID: \"e0ba352c-17c3-4c36-b409-83485c265668\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.945748 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0ba352c-17c3-4c36-b409-83485c265668-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tnb9k\" (UID: \"e0ba352c-17c3-4c36-b409-83485c265668\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" Dec 16 15:28:54 crc kubenswrapper[4775]: I1216 15:28:54.968043 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fzwb\" (UniqueName: \"kubernetes.io/projected/e0ba352c-17c3-4c36-b409-83485c265668-kube-api-access-7fzwb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-tnb9k\" (UID: \"e0ba352c-17c3-4c36-b409-83485c265668\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" Dec 16 15:28:55 crc kubenswrapper[4775]: I1216 15:28:55.136330 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" Dec 16 15:28:55 crc kubenswrapper[4775]: I1216 15:28:55.628405 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k"] Dec 16 15:28:55 crc kubenswrapper[4775]: I1216 15:28:55.698314 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" event={"ID":"e0ba352c-17c3-4c36-b409-83485c265668","Type":"ContainerStarted","Data":"3c9f960b592b5c3250d2e22738a7d344d2b9783158fa41b73402c28802b6a4b1"} Dec 16 15:28:55 crc kubenswrapper[4775]: I1216 15:28:55.698496 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4kblw" podUID="4b8ef266-59b2-42c8-a776-ac22d1cdf15c" containerName="registry-server" containerID="cri-o://003d435fbdd7b3f91f6e95f966e30f25ca994840c8ed8361728b07bd532b2a3d" gracePeriod=2 Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.183345 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4kblw" Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.264074 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-utilities\") pod \"4b8ef266-59b2-42c8-a776-ac22d1cdf15c\" (UID: \"4b8ef266-59b2-42c8-a776-ac22d1cdf15c\") " Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.264145 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-catalog-content\") pod \"4b8ef266-59b2-42c8-a776-ac22d1cdf15c\" (UID: \"4b8ef266-59b2-42c8-a776-ac22d1cdf15c\") " Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.265238 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-utilities" (OuterVolumeSpecName: "utilities") pod "4b8ef266-59b2-42c8-a776-ac22d1cdf15c" (UID: "4b8ef266-59b2-42c8-a776-ac22d1cdf15c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.266519 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n7dn\" (UniqueName: \"kubernetes.io/projected/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-kube-api-access-9n7dn\") pod \"4b8ef266-59b2-42c8-a776-ac22d1cdf15c\" (UID: \"4b8ef266-59b2-42c8-a776-ac22d1cdf15c\") " Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.267001 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.270197 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-kube-api-access-9n7dn" (OuterVolumeSpecName: "kube-api-access-9n7dn") pod "4b8ef266-59b2-42c8-a776-ac22d1cdf15c" (UID: "4b8ef266-59b2-42c8-a776-ac22d1cdf15c"). InnerVolumeSpecName "kube-api-access-9n7dn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.369951 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n7dn\" (UniqueName: \"kubernetes.io/projected/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-kube-api-access-9n7dn\") on node \"crc\" DevicePath \"\"" Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.386213 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b8ef266-59b2-42c8-a776-ac22d1cdf15c" (UID: "4b8ef266-59b2-42c8-a776-ac22d1cdf15c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.471940 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b8ef266-59b2-42c8-a776-ac22d1cdf15c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.708690 4775 generic.go:334] "Generic (PLEG): container finished" podID="4b8ef266-59b2-42c8-a776-ac22d1cdf15c" containerID="003d435fbdd7b3f91f6e95f966e30f25ca994840c8ed8361728b07bd532b2a3d" exitCode=0 Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.708742 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kblw" event={"ID":"4b8ef266-59b2-42c8-a776-ac22d1cdf15c","Type":"ContainerDied","Data":"003d435fbdd7b3f91f6e95f966e30f25ca994840c8ed8361728b07bd532b2a3d"} Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.709044 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kblw" event={"ID":"4b8ef266-59b2-42c8-a776-ac22d1cdf15c","Type":"ContainerDied","Data":"275201cadf658e53c27055755096ba6069253347f23ce6e1e0848361da2bdd83"} Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.708761 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4kblw" Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.709064 4775 scope.go:117] "RemoveContainer" containerID="003d435fbdd7b3f91f6e95f966e30f25ca994840c8ed8361728b07bd532b2a3d" Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.711123 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" event={"ID":"e0ba352c-17c3-4c36-b409-83485c265668","Type":"ContainerStarted","Data":"b2ff597f54aeb115ca12f07540175b64ff65b0cc106988ca95fc2aa1e9801ea6"} Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.740381 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" podStartSLOduration=2.167507086 podStartE2EDuration="2.740358992s" podCreationTimestamp="2025-12-16 15:28:54 +0000 UTC" firstStartedPulling="2025-12-16 15:28:55.633793286 +0000 UTC m=+2060.584872219" lastFinishedPulling="2025-12-16 15:28:56.206645192 +0000 UTC m=+2061.157724125" observedRunningTime="2025-12-16 15:28:56.733558797 +0000 UTC m=+2061.684637720" watchObservedRunningTime="2025-12-16 15:28:56.740358992 +0000 UTC m=+2061.691437915" Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.755370 4775 scope.go:117] "RemoveContainer" containerID="ead2086a94ddde3ee601252da2545af98a0c0fc335d6ce5f4d85c62bad387ee7" Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.768135 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4kblw"] Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.774657 4775 scope.go:117] "RemoveContainer" containerID="b9b4d3744221d0015ffafe272fc0c8800137d53539bab46ce5854cefd7ef33ab" Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.775319 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4kblw"] Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.795460 
4775 scope.go:117] "RemoveContainer" containerID="003d435fbdd7b3f91f6e95f966e30f25ca994840c8ed8361728b07bd532b2a3d" Dec 16 15:28:56 crc kubenswrapper[4775]: E1216 15:28:56.795690 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"003d435fbdd7b3f91f6e95f966e30f25ca994840c8ed8361728b07bd532b2a3d\": container with ID starting with 003d435fbdd7b3f91f6e95f966e30f25ca994840c8ed8361728b07bd532b2a3d not found: ID does not exist" containerID="003d435fbdd7b3f91f6e95f966e30f25ca994840c8ed8361728b07bd532b2a3d" Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.795718 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"003d435fbdd7b3f91f6e95f966e30f25ca994840c8ed8361728b07bd532b2a3d"} err="failed to get container status \"003d435fbdd7b3f91f6e95f966e30f25ca994840c8ed8361728b07bd532b2a3d\": rpc error: code = NotFound desc = could not find container \"003d435fbdd7b3f91f6e95f966e30f25ca994840c8ed8361728b07bd532b2a3d\": container with ID starting with 003d435fbdd7b3f91f6e95f966e30f25ca994840c8ed8361728b07bd532b2a3d not found: ID does not exist" Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.795740 4775 scope.go:117] "RemoveContainer" containerID="ead2086a94ddde3ee601252da2545af98a0c0fc335d6ce5f4d85c62bad387ee7" Dec 16 15:28:56 crc kubenswrapper[4775]: E1216 15:28:56.796005 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ead2086a94ddde3ee601252da2545af98a0c0fc335d6ce5f4d85c62bad387ee7\": container with ID starting with ead2086a94ddde3ee601252da2545af98a0c0fc335d6ce5f4d85c62bad387ee7 not found: ID does not exist" containerID="ead2086a94ddde3ee601252da2545af98a0c0fc335d6ce5f4d85c62bad387ee7" Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.796106 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ead2086a94ddde3ee601252da2545af98a0c0fc335d6ce5f4d85c62bad387ee7"} err="failed to get container status \"ead2086a94ddde3ee601252da2545af98a0c0fc335d6ce5f4d85c62bad387ee7\": rpc error: code = NotFound desc = could not find container \"ead2086a94ddde3ee601252da2545af98a0c0fc335d6ce5f4d85c62bad387ee7\": container with ID starting with ead2086a94ddde3ee601252da2545af98a0c0fc335d6ce5f4d85c62bad387ee7 not found: ID does not exist" Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.796187 4775 scope.go:117] "RemoveContainer" containerID="b9b4d3744221d0015ffafe272fc0c8800137d53539bab46ce5854cefd7ef33ab" Dec 16 15:28:56 crc kubenswrapper[4775]: E1216 15:28:56.796558 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9b4d3744221d0015ffafe272fc0c8800137d53539bab46ce5854cefd7ef33ab\": container with ID starting with b9b4d3744221d0015ffafe272fc0c8800137d53539bab46ce5854cefd7ef33ab not found: ID does not exist" containerID="b9b4d3744221d0015ffafe272fc0c8800137d53539bab46ce5854cefd7ef33ab" Dec 16 15:28:56 crc kubenswrapper[4775]: I1216 15:28:56.796622 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9b4d3744221d0015ffafe272fc0c8800137d53539bab46ce5854cefd7ef33ab"} err="failed to get container status \"b9b4d3744221d0015ffafe272fc0c8800137d53539bab46ce5854cefd7ef33ab\": rpc error: code = NotFound desc = could not find container \"b9b4d3744221d0015ffafe272fc0c8800137d53539bab46ce5854cefd7ef33ab\": container with ID starting with b9b4d3744221d0015ffafe272fc0c8800137d53539bab46ce5854cefd7ef33ab not found: ID does not exist" Dec 16 15:28:57 crc kubenswrapper[4775]: I1216 15:28:57.351265 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b8ef266-59b2-42c8-a776-ac22d1cdf15c" path="/var/lib/kubelet/pods/4b8ef266-59b2-42c8-a776-ac22d1cdf15c/volumes" Dec 16 15:29:04 crc kubenswrapper[4775]: I1216 
15:29:04.801296 4775 generic.go:334] "Generic (PLEG): container finished" podID="e0ba352c-17c3-4c36-b409-83485c265668" containerID="b2ff597f54aeb115ca12f07540175b64ff65b0cc106988ca95fc2aa1e9801ea6" exitCode=0 Dec 16 15:29:04 crc kubenswrapper[4775]: I1216 15:29:04.801437 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" event={"ID":"e0ba352c-17c3-4c36-b409-83485c265668","Type":"ContainerDied","Data":"b2ff597f54aeb115ca12f07540175b64ff65b0cc106988ca95fc2aa1e9801ea6"} Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.211426 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.255551 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0ba352c-17c3-4c36-b409-83485c265668-inventory\") pod \"e0ba352c-17c3-4c36-b409-83485c265668\" (UID: \"e0ba352c-17c3-4c36-b409-83485c265668\") " Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.255665 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0ba352c-17c3-4c36-b409-83485c265668-ssh-key\") pod \"e0ba352c-17c3-4c36-b409-83485c265668\" (UID: \"e0ba352c-17c3-4c36-b409-83485c265668\") " Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.255747 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fzwb\" (UniqueName: \"kubernetes.io/projected/e0ba352c-17c3-4c36-b409-83485c265668-kube-api-access-7fzwb\") pod \"e0ba352c-17c3-4c36-b409-83485c265668\" (UID: \"e0ba352c-17c3-4c36-b409-83485c265668\") " Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.261486 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e0ba352c-17c3-4c36-b409-83485c265668-kube-api-access-7fzwb" (OuterVolumeSpecName: "kube-api-access-7fzwb") pod "e0ba352c-17c3-4c36-b409-83485c265668" (UID: "e0ba352c-17c3-4c36-b409-83485c265668"). InnerVolumeSpecName "kube-api-access-7fzwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.281424 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ba352c-17c3-4c36-b409-83485c265668-inventory" (OuterVolumeSpecName: "inventory") pod "e0ba352c-17c3-4c36-b409-83485c265668" (UID: "e0ba352c-17c3-4c36-b409-83485c265668"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.281696 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ba352c-17c3-4c36-b409-83485c265668-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e0ba352c-17c3-4c36-b409-83485c265668" (UID: "e0ba352c-17c3-4c36-b409-83485c265668"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.357819 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0ba352c-17c3-4c36-b409-83485c265668-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.357868 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0ba352c-17c3-4c36-b409-83485c265668-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.357880 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fzwb\" (UniqueName: \"kubernetes.io/projected/e0ba352c-17c3-4c36-b409-83485c265668-kube-api-access-7fzwb\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.821253 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" event={"ID":"e0ba352c-17c3-4c36-b409-83485c265668","Type":"ContainerDied","Data":"3c9f960b592b5c3250d2e22738a7d344d2b9783158fa41b73402c28802b6a4b1"} Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.821300 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c9f960b592b5c3250d2e22738a7d344d2b9783158fa41b73402c28802b6a4b1" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.821400 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-tnb9k" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.892728 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f"] Dec 16 15:29:06 crc kubenswrapper[4775]: E1216 15:29:06.893439 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b8ef266-59b2-42c8-a776-ac22d1cdf15c" containerName="registry-server" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.893518 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b8ef266-59b2-42c8-a776-ac22d1cdf15c" containerName="registry-server" Dec 16 15:29:06 crc kubenswrapper[4775]: E1216 15:29:06.893571 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b8ef266-59b2-42c8-a776-ac22d1cdf15c" containerName="extract-content" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.893627 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b8ef266-59b2-42c8-a776-ac22d1cdf15c" containerName="extract-content" Dec 16 15:29:06 crc kubenswrapper[4775]: E1216 15:29:06.893708 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ba352c-17c3-4c36-b409-83485c265668" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.893798 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ba352c-17c3-4c36-b409-83485c265668" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:29:06 crc kubenswrapper[4775]: E1216 15:29:06.893920 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b8ef266-59b2-42c8-a776-ac22d1cdf15c" containerName="extract-utilities" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.893985 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b8ef266-59b2-42c8-a776-ac22d1cdf15c" containerName="extract-utilities" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.894319 4775 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4b8ef266-59b2-42c8-a776-ac22d1cdf15c" containerName="registry-server" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.894411 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ba352c-17c3-4c36-b409-83485c265668" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.895131 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.897715 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.897788 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.899382 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tgv5f" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.899665 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.911552 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f"] Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.973833 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f\" (UID: \"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.974084 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdtmm\" (UniqueName: \"kubernetes.io/projected/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-kube-api-access-zdtmm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f\" (UID: \"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f" Dec 16 15:29:06 crc kubenswrapper[4775]: I1216 15:29:06.974162 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f\" (UID: \"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f" Dec 16 15:29:07 crc kubenswrapper[4775]: I1216 15:29:07.075653 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdtmm\" (UniqueName: \"kubernetes.io/projected/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-kube-api-access-zdtmm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f\" (UID: \"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f" Dec 16 15:29:07 crc kubenswrapper[4775]: I1216 15:29:07.075726 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f\" (UID: \"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f" Dec 16 15:29:07 crc kubenswrapper[4775]: I1216 15:29:07.075808 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f\" (UID: \"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f" Dec 16 15:29:07 crc kubenswrapper[4775]: I1216 15:29:07.081635 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f\" (UID: \"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f" Dec 16 15:29:07 crc kubenswrapper[4775]: I1216 15:29:07.083770 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f\" (UID: \"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f" Dec 16 15:29:07 crc kubenswrapper[4775]: I1216 15:29:07.101407 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdtmm\" (UniqueName: \"kubernetes.io/projected/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-kube-api-access-zdtmm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f\" (UID: \"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f" Dec 16 15:29:07 crc kubenswrapper[4775]: I1216 15:29:07.248520 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f" Dec 16 15:29:07 crc kubenswrapper[4775]: I1216 15:29:07.747153 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f"] Dec 16 15:29:07 crc kubenswrapper[4775]: W1216 15:29:07.751500 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddab0db60_d31f_4e9d_b17e_5dea1fdc90cb.slice/crio-667965952bc28e6ab60bb978390afa72c851ce36d915abb6cae8ee73a167e4ae WatchSource:0}: Error finding container 667965952bc28e6ab60bb978390afa72c851ce36d915abb6cae8ee73a167e4ae: Status 404 returned error can't find the container with id 667965952bc28e6ab60bb978390afa72c851ce36d915abb6cae8ee73a167e4ae Dec 16 15:29:07 crc kubenswrapper[4775]: I1216 15:29:07.754475 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 15:29:07 crc kubenswrapper[4775]: I1216 15:29:07.831316 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f" event={"ID":"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb","Type":"ContainerStarted","Data":"667965952bc28e6ab60bb978390afa72c851ce36d915abb6cae8ee73a167e4ae"} Dec 16 15:29:08 crc kubenswrapper[4775]: I1216 15:29:08.844490 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f" event={"ID":"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb","Type":"ContainerStarted","Data":"1616fb13d5f553a485d6c32cf69b0a613fdd0fac7cd516e9229ba6f9b73a35f8"} Dec 16 15:29:08 crc kubenswrapper[4775]: I1216 15:29:08.864348 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f" podStartSLOduration=2.219861303 podStartE2EDuration="2.8643302s" podCreationTimestamp="2025-12-16 15:29:06 +0000 UTC" 
firstStartedPulling="2025-12-16 15:29:07.754185772 +0000 UTC m=+2072.705264695" lastFinishedPulling="2025-12-16 15:29:08.398654669 +0000 UTC m=+2073.349733592" observedRunningTime="2025-12-16 15:29:08.860010305 +0000 UTC m=+2073.811089248" watchObservedRunningTime="2025-12-16 15:29:08.8643302 +0000 UTC m=+2073.815409123" Dec 16 15:29:18 crc kubenswrapper[4775]: E1216 15:29:18.757190 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddab0db60_d31f_4e9d_b17e_5dea1fdc90cb.slice/crio-1616fb13d5f553a485d6c32cf69b0a613fdd0fac7cd516e9229ba6f9b73a35f8.scope\": RecentStats: unable to find data in memory cache]" Dec 16 15:29:18 crc kubenswrapper[4775]: I1216 15:29:18.952789 4775 generic.go:334] "Generic (PLEG): container finished" podID="dab0db60-d31f-4e9d-b17e-5dea1fdc90cb" containerID="1616fb13d5f553a485d6c32cf69b0a613fdd0fac7cd516e9229ba6f9b73a35f8" exitCode=0 Dec 16 15:29:18 crc kubenswrapper[4775]: I1216 15:29:18.952868 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f" event={"ID":"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb","Type":"ContainerDied","Data":"1616fb13d5f553a485d6c32cf69b0a613fdd0fac7cd516e9229ba6f9b73a35f8"} Dec 16 15:29:20 crc kubenswrapper[4775]: I1216 15:29:20.385054 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f"
Dec 16 15:29:20 crc kubenswrapper[4775]: I1216 15:29:20.485826 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-ssh-key\") pod \"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb\" (UID: \"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb\") "
Dec 16 15:29:20 crc kubenswrapper[4775]: I1216 15:29:20.485908 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-inventory\") pod \"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb\" (UID: \"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb\") "
Dec 16 15:29:20 crc kubenswrapper[4775]: I1216 15:29:20.486171 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdtmm\" (UniqueName: \"kubernetes.io/projected/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-kube-api-access-zdtmm\") pod \"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb\" (UID: \"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb\") "
Dec 16 15:29:20 crc kubenswrapper[4775]: I1216 15:29:20.491129 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-kube-api-access-zdtmm" (OuterVolumeSpecName: "kube-api-access-zdtmm") pod "dab0db60-d31f-4e9d-b17e-5dea1fdc90cb" (UID: "dab0db60-d31f-4e9d-b17e-5dea1fdc90cb"). InnerVolumeSpecName "kube-api-access-zdtmm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 15:29:20 crc kubenswrapper[4775]: I1216 15:29:20.510638 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dab0db60-d31f-4e9d-b17e-5dea1fdc90cb" (UID: "dab0db60-d31f-4e9d-b17e-5dea1fdc90cb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:29:20 crc kubenswrapper[4775]: I1216 15:29:20.510665 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-inventory" (OuterVolumeSpecName: "inventory") pod "dab0db60-d31f-4e9d-b17e-5dea1fdc90cb" (UID: "dab0db60-d31f-4e9d-b17e-5dea1fdc90cb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 15:29:20 crc kubenswrapper[4775]: I1216 15:29:20.588353 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 16 15:29:20 crc kubenswrapper[4775]: I1216 15:29:20.588428 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-inventory\") on node \"crc\" DevicePath \"\""
Dec 16 15:29:20 crc kubenswrapper[4775]: I1216 15:29:20.588443 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdtmm\" (UniqueName: \"kubernetes.io/projected/dab0db60-d31f-4e9d-b17e-5dea1fdc90cb-kube-api-access-zdtmm\") on node \"crc\" DevicePath \"\""
Dec 16 15:29:20 crc kubenswrapper[4775]: I1216 15:29:20.971682 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f" event={"ID":"dab0db60-d31f-4e9d-b17e-5dea1fdc90cb","Type":"ContainerDied","Data":"667965952bc28e6ab60bb978390afa72c851ce36d915abb6cae8ee73a167e4ae"}
Dec 16 15:29:20 crc kubenswrapper[4775]: I1216 15:29:20.971730 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="667965952bc28e6ab60bb978390afa72c851ce36d915abb6cae8ee73a167e4ae"
Dec 16 15:29:20 crc kubenswrapper[4775]: I1216 15:29:20.971794 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.071715 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"]
Dec 16 15:29:21 crc kubenswrapper[4775]: E1216 15:29:21.072188 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab0db60-d31f-4e9d-b17e-5dea1fdc90cb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.072245 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab0db60-d31f-4e9d-b17e-5dea1fdc90cb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.072542 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab0db60-d31f-4e9d-b17e-5dea1fdc90cb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.073151 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.075193 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.075390 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.075518 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.075674 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.075716 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.075733 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tgv5f"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.075684 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.076067 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.086781 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"]
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.109838 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.110032 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.110069 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rftjd\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-kube-api-access-rftjd\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.110110 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.110156 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.110210 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.110277 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.110301 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.110338 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.110384 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.110412 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.110467 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.110508 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.110532 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.211767 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.211827 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rftjd\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-kube-api-access-rftjd\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.211862 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.211914 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.211953 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.211996 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.212022 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.212052 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.212081 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.212102 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.212144 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.212170 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.212190 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.212251 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.216426 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.216687 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.216579 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.217216 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.217485 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.217619 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.217649 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.218372 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.219106 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.219299 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.219470 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.220521 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.226059 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.249719 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rftjd\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-kube-api-access-rftjd\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mxktb\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.393267 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.942922 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"]
Dec 16 15:29:21 crc kubenswrapper[4775]: W1216 15:29:21.943027 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f07dc8f_f161_4826_b191_4344f1b741e0.slice/crio-8647fcf52373b99918c65fe82bf908759cde8788fcee0790127f067c1a8f9a36 WatchSource:0}: Error finding container 8647fcf52373b99918c65fe82bf908759cde8788fcee0790127f067c1a8f9a36: Status 404 returned error can't find the container with id 8647fcf52373b99918c65fe82bf908759cde8788fcee0790127f067c1a8f9a36
Dec 16 15:29:21 crc kubenswrapper[4775]: I1216 15:29:21.981340 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb" event={"ID":"5f07dc8f-f161-4826-b191-4344f1b741e0","Type":"ContainerStarted","Data":"8647fcf52373b99918c65fe82bf908759cde8788fcee0790127f067c1a8f9a36"}
Dec 16 15:29:22 crc kubenswrapper[4775]: I1216 15:29:22.992531 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb" event={"ID":"5f07dc8f-f161-4826-b191-4344f1b741e0","Type":"ContainerStarted","Data":"652d34a39426ea629b934980dc9a7a2df117837dc147ee0c484bae9266e8d476"}
Dec 16 15:29:23 crc kubenswrapper[4775]: I1216 15:29:23.015379 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb" podStartSLOduration=1.473076166 podStartE2EDuration="2.015354306s" podCreationTimestamp="2025-12-16 15:29:21 +0000 UTC" firstStartedPulling="2025-12-16 15:29:21.945583792 +0000 UTC m=+2086.896662715" lastFinishedPulling="2025-12-16 15:29:22.487861912 +0000 UTC m=+2087.438940855" observedRunningTime="2025-12-16 15:29:23.010386158 +0000 UTC m=+2087.961465121" watchObservedRunningTime="2025-12-16 15:29:23.015354306 +0000 UTC m=+2087.966433249"
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.149646 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj"]
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.151733 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj"
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.154023 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.154128 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.161703 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj"]
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.302297 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsnnz\" (UniqueName: \"kubernetes.io/projected/7dd869ac-017f-41de-be03-cf9afde344e8-kube-api-access-jsnnz\") pod \"collect-profiles-29431650-w5jlj\" (UID: \"7dd869ac-017f-41de-be03-cf9afde344e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj"
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.302476 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7dd869ac-017f-41de-be03-cf9afde344e8-secret-volume\") pod \"collect-profiles-29431650-w5jlj\" (UID: \"7dd869ac-017f-41de-be03-cf9afde344e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj"
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.302522 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7dd869ac-017f-41de-be03-cf9afde344e8-config-volume\") pod \"collect-profiles-29431650-w5jlj\" (UID: \"7dd869ac-017f-41de-be03-cf9afde344e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj"
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.404429 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsnnz\" (UniqueName: \"kubernetes.io/projected/7dd869ac-017f-41de-be03-cf9afde344e8-kube-api-access-jsnnz\") pod \"collect-profiles-29431650-w5jlj\" (UID: \"7dd869ac-017f-41de-be03-cf9afde344e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj"
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.404507 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7dd869ac-017f-41de-be03-cf9afde344e8-secret-volume\") pod \"collect-profiles-29431650-w5jlj\" (UID: \"7dd869ac-017f-41de-be03-cf9afde344e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj"
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.404533 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7dd869ac-017f-41de-be03-cf9afde344e8-config-volume\") pod \"collect-profiles-29431650-w5jlj\" (UID: \"7dd869ac-017f-41de-be03-cf9afde344e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj"
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.405465 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7dd869ac-017f-41de-be03-cf9afde344e8-config-volume\") pod \"collect-profiles-29431650-w5jlj\" (UID: \"7dd869ac-017f-41de-be03-cf9afde344e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj"
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.410976 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7dd869ac-017f-41de-be03-cf9afde344e8-secret-volume\") pod \"collect-profiles-29431650-w5jlj\" (UID: \"7dd869ac-017f-41de-be03-cf9afde344e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj"
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.420757 4775 generic.go:334] "Generic (PLEG): container finished" podID="5f07dc8f-f161-4826-b191-4344f1b741e0" containerID="652d34a39426ea629b934980dc9a7a2df117837dc147ee0c484bae9266e8d476" exitCode=0
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.420808 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb" event={"ID":"5f07dc8f-f161-4826-b191-4344f1b741e0","Type":"ContainerDied","Data":"652d34a39426ea629b934980dc9a7a2df117837dc147ee0c484bae9266e8d476"}
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.435625 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsnnz\" (UniqueName: \"kubernetes.io/projected/7dd869ac-017f-41de-be03-cf9afde344e8-kube-api-access-jsnnz\") pod \"collect-profiles-29431650-w5jlj\" (UID: \"7dd869ac-017f-41de-be03-cf9afde344e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj"
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.479467 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj"
Dec 16 15:30:00 crc kubenswrapper[4775]: I1216 15:30:00.967784 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj"]
Dec 16 15:30:00 crc kubenswrapper[4775]: W1216 15:30:00.973447 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dd869ac_017f_41de_be03_cf9afde344e8.slice/crio-000c4b4c1e9a6fa21fa1cde34b23b9ef860361f6072bbef8dbbfd02e35f19157 WatchSource:0}: Error finding container 000c4b4c1e9a6fa21fa1cde34b23b9ef860361f6072bbef8dbbfd02e35f19157: Status 404 returned error can't find the container with id 000c4b4c1e9a6fa21fa1cde34b23b9ef860361f6072bbef8dbbfd02e35f19157
Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.430958 4775 generic.go:334] "Generic (PLEG): container finished" podID="7dd869ac-017f-41de-be03-cf9afde344e8" containerID="f020708284c0c9fd1db186f57435aca9d59040e146409fcfc3f8304d2be60dcb" exitCode=0
Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.431619 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj" event={"ID":"7dd869ac-017f-41de-be03-cf9afde344e8","Type":"ContainerDied","Data":"f020708284c0c9fd1db186f57435aca9d59040e146409fcfc3f8304d2be60dcb"}
Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.431676 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj" event={"ID":"7dd869ac-017f-41de-be03-cf9afde344e8","Type":"ContainerStarted","Data":"000c4b4c1e9a6fa21fa1cde34b23b9ef860361f6072bbef8dbbfd02e35f19157"}
Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.858093 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb"
Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.946576 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"5f07dc8f-f161-4826-b191-4344f1b741e0\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") "
Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.946871 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-nova-combined-ca-bundle\") pod \"5f07dc8f-f161-4826-b191-4344f1b741e0\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") "
Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.946937 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-inventory\") pod \"5f07dc8f-f161-4826-b191-4344f1b741e0\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") "
Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.947031 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"5f07dc8f-f161-4826-b191-4344f1b741e0\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") "
Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.947068 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-repo-setup-combined-ca-bundle\") pod \"5f07dc8f-f161-4826-b191-4344f1b741e0\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") "
Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.947118 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rftjd\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-kube-api-access-rftjd\") pod \"5f07dc8f-f161-4826-b191-4344f1b741e0\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") "
Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.947457 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-bootstrap-combined-ca-bundle\") pod \"5f07dc8f-f161-4826-b191-4344f1b741e0\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") "
Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.947518 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-ovn-combined-ca-bundle\") pod \"5f07dc8f-f161-4826-b191-4344f1b741e0\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") "
Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.947553 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-neutron-metadata-combined-ca-bundle\") pod \"5f07dc8f-f161-4826-b191-4344f1b741e0\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") "
Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.947643 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-libvirt-combined-ca-bundle\") pod \"5f07dc8f-f161-4826-b191-4344f1b741e0\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") "
Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.947677 4775
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-ssh-key\") pod \"5f07dc8f-f161-4826-b191-4344f1b741e0\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.947709 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-telemetry-combined-ca-bundle\") pod \"5f07dc8f-f161-4826-b191-4344f1b741e0\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.947779 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"5f07dc8f-f161-4826-b191-4344f1b741e0\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.947806 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"5f07dc8f-f161-4826-b191-4344f1b741e0\" (UID: \"5f07dc8f-f161-4826-b191-4344f1b741e0\") " Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.954062 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5f07dc8f-f161-4826-b191-4344f1b741e0" (UID: "5f07dc8f-f161-4826-b191-4344f1b741e0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.954069 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "5f07dc8f-f161-4826-b191-4344f1b741e0" (UID: "5f07dc8f-f161-4826-b191-4344f1b741e0"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.954528 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5f07dc8f-f161-4826-b191-4344f1b741e0" (UID: "5f07dc8f-f161-4826-b191-4344f1b741e0"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.955696 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "5f07dc8f-f161-4826-b191-4344f1b741e0" (UID: "5f07dc8f-f161-4826-b191-4344f1b741e0"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.955841 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "5f07dc8f-f161-4826-b191-4344f1b741e0" (UID: "5f07dc8f-f161-4826-b191-4344f1b741e0"). 
InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.956242 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "5f07dc8f-f161-4826-b191-4344f1b741e0" (UID: "5f07dc8f-f161-4826-b191-4344f1b741e0"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.956356 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "5f07dc8f-f161-4826-b191-4344f1b741e0" (UID: "5f07dc8f-f161-4826-b191-4344f1b741e0"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.956584 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-kube-api-access-rftjd" (OuterVolumeSpecName: "kube-api-access-rftjd") pod "5f07dc8f-f161-4826-b191-4344f1b741e0" (UID: "5f07dc8f-f161-4826-b191-4344f1b741e0"). InnerVolumeSpecName "kube-api-access-rftjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.957956 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "5f07dc8f-f161-4826-b191-4344f1b741e0" (UID: "5f07dc8f-f161-4826-b191-4344f1b741e0"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.958427 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "5f07dc8f-f161-4826-b191-4344f1b741e0" (UID: "5f07dc8f-f161-4826-b191-4344f1b741e0"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.959356 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "5f07dc8f-f161-4826-b191-4344f1b741e0" (UID: "5f07dc8f-f161-4826-b191-4344f1b741e0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.968087 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "5f07dc8f-f161-4826-b191-4344f1b741e0" (UID: "5f07dc8f-f161-4826-b191-4344f1b741e0"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.983713 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-inventory" (OuterVolumeSpecName: "inventory") pod "5f07dc8f-f161-4826-b191-4344f1b741e0" (UID: "5f07dc8f-f161-4826-b191-4344f1b741e0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:30:01 crc kubenswrapper[4775]: I1216 15:30:01.985078 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5f07dc8f-f161-4826-b191-4344f1b741e0" (UID: "5f07dc8f-f161-4826-b191-4344f1b741e0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.050602 4775 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.050655 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.050667 4775 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.050685 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.050699 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:02 crc kubenswrapper[4775]: 
I1216 15:30:02.050711 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.050723 4775 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.050734 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.050745 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.050759 4775 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.050771 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rftjd\" (UniqueName: \"kubernetes.io/projected/5f07dc8f-f161-4826-b191-4344f1b741e0-kube-api-access-rftjd\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.050783 4775 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-bootstrap-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.050794 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.050805 4775 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f07dc8f-f161-4826-b191-4344f1b741e0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.195023 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vsfkp"] Dec 16 15:30:02 crc kubenswrapper[4775]: E1216 15:30:02.195597 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f07dc8f-f161-4826-b191-4344f1b741e0" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.195621 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f07dc8f-f161-4826-b191-4344f1b741e0" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.195844 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f07dc8f-f161-4826-b191-4344f1b741e0" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.198095 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vsfkp" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.206862 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vsfkp"] Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.360386 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84h47\" (UniqueName: \"kubernetes.io/projected/45cea987-db5b-4cb3-944d-b242b75580b3-kube-api-access-84h47\") pod \"redhat-marketplace-vsfkp\" (UID: \"45cea987-db5b-4cb3-944d-b242b75580b3\") " pod="openshift-marketplace/redhat-marketplace-vsfkp" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.360595 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45cea987-db5b-4cb3-944d-b242b75580b3-catalog-content\") pod \"redhat-marketplace-vsfkp\" (UID: \"45cea987-db5b-4cb3-944d-b242b75580b3\") " pod="openshift-marketplace/redhat-marketplace-vsfkp" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.360777 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45cea987-db5b-4cb3-944d-b242b75580b3-utilities\") pod \"redhat-marketplace-vsfkp\" (UID: \"45cea987-db5b-4cb3-944d-b242b75580b3\") " pod="openshift-marketplace/redhat-marketplace-vsfkp" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.442053 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb" event={"ID":"5f07dc8f-f161-4826-b191-4344f1b741e0","Type":"ContainerDied","Data":"8647fcf52373b99918c65fe82bf908759cde8788fcee0790127f067c1a8f9a36"} Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.442122 4775 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="8647fcf52373b99918c65fe82bf908759cde8788fcee0790127f067c1a8f9a36" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.442076 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mxktb" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.463246 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45cea987-db5b-4cb3-944d-b242b75580b3-utilities\") pod \"redhat-marketplace-vsfkp\" (UID: \"45cea987-db5b-4cb3-944d-b242b75580b3\") " pod="openshift-marketplace/redhat-marketplace-vsfkp" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.463448 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84h47\" (UniqueName: \"kubernetes.io/projected/45cea987-db5b-4cb3-944d-b242b75580b3-kube-api-access-84h47\") pod \"redhat-marketplace-vsfkp\" (UID: \"45cea987-db5b-4cb3-944d-b242b75580b3\") " pod="openshift-marketplace/redhat-marketplace-vsfkp" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.463560 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45cea987-db5b-4cb3-944d-b242b75580b3-catalog-content\") pod \"redhat-marketplace-vsfkp\" (UID: \"45cea987-db5b-4cb3-944d-b242b75580b3\") " pod="openshift-marketplace/redhat-marketplace-vsfkp" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.464201 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45cea987-db5b-4cb3-944d-b242b75580b3-utilities\") pod \"redhat-marketplace-vsfkp\" (UID: \"45cea987-db5b-4cb3-944d-b242b75580b3\") " pod="openshift-marketplace/redhat-marketplace-vsfkp" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.464549 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45cea987-db5b-4cb3-944d-b242b75580b3-catalog-content\") pod \"redhat-marketplace-vsfkp\" (UID: \"45cea987-db5b-4cb3-944d-b242b75580b3\") " pod="openshift-marketplace/redhat-marketplace-vsfkp" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.493106 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84h47\" (UniqueName: \"kubernetes.io/projected/45cea987-db5b-4cb3-944d-b242b75580b3-kube-api-access-84h47\") pod \"redhat-marketplace-vsfkp\" (UID: \"45cea987-db5b-4cb3-944d-b242b75580b3\") " pod="openshift-marketplace/redhat-marketplace-vsfkp" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.522326 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vsfkp" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.548916 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b"] Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.550097 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.555904 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.555970 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tgv5f" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.556082 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.556196 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.556224 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.565477 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b"] Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.675627 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fdz4b\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.676034 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fdz4b\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.676097 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4drf\" (UniqueName: \"kubernetes.io/projected/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-kube-api-access-v4drf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fdz4b\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.676189 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fdz4b\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.676215 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fdz4b\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.787193 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fdz4b\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.787277 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-v4drf\" (UniqueName: \"kubernetes.io/projected/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-kube-api-access-v4drf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fdz4b\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.787359 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fdz4b\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.788447 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fdz4b\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.787426 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fdz4b\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.789027 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fdz4b\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:30:02 crc 
kubenswrapper[4775]: I1216 15:30:02.795169 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fdz4b\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.796786 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fdz4b\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.799415 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fdz4b\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.818839 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4drf\" (UniqueName: \"kubernetes.io/projected/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-kube-api-access-v4drf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fdz4b\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.885144 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 
15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.885237 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.933286 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.962351 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:30:02 crc kubenswrapper[4775]: I1216 15:30:02.977205 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vsfkp"] Dec 16 15:30:03 crc kubenswrapper[4775]: I1216 15:30:03.106739 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsnnz\" (UniqueName: \"kubernetes.io/projected/7dd869ac-017f-41de-be03-cf9afde344e8-kube-api-access-jsnnz\") pod \"7dd869ac-017f-41de-be03-cf9afde344e8\" (UID: \"7dd869ac-017f-41de-be03-cf9afde344e8\") " Dec 16 15:30:03 crc kubenswrapper[4775]: I1216 15:30:03.107711 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7dd869ac-017f-41de-be03-cf9afde344e8-config-volume\") pod \"7dd869ac-017f-41de-be03-cf9afde344e8\" (UID: \"7dd869ac-017f-41de-be03-cf9afde344e8\") " Dec 16 15:30:03 crc kubenswrapper[4775]: I1216 15:30:03.108083 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7dd869ac-017f-41de-be03-cf9afde344e8-secret-volume\") pod \"7dd869ac-017f-41de-be03-cf9afde344e8\" 
(UID: \"7dd869ac-017f-41de-be03-cf9afde344e8\") " Dec 16 15:30:03 crc kubenswrapper[4775]: I1216 15:30:03.108938 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dd869ac-017f-41de-be03-cf9afde344e8-config-volume" (OuterVolumeSpecName: "config-volume") pod "7dd869ac-017f-41de-be03-cf9afde344e8" (UID: "7dd869ac-017f-41de-be03-cf9afde344e8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:30:03 crc kubenswrapper[4775]: I1216 15:30:03.114228 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dd869ac-017f-41de-be03-cf9afde344e8-kube-api-access-jsnnz" (OuterVolumeSpecName: "kube-api-access-jsnnz") pod "7dd869ac-017f-41de-be03-cf9afde344e8" (UID: "7dd869ac-017f-41de-be03-cf9afde344e8"). InnerVolumeSpecName "kube-api-access-jsnnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:30:03 crc kubenswrapper[4775]: I1216 15:30:03.115513 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd869ac-017f-41de-be03-cf9afde344e8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7dd869ac-017f-41de-be03-cf9afde344e8" (UID: "7dd869ac-017f-41de-be03-cf9afde344e8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:30:03 crc kubenswrapper[4775]: I1216 15:30:03.210818 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7dd869ac-017f-41de-be03-cf9afde344e8-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:03 crc kubenswrapper[4775]: I1216 15:30:03.210847 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsnnz\" (UniqueName: \"kubernetes.io/projected/7dd869ac-017f-41de-be03-cf9afde344e8-kube-api-access-jsnnz\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:03 crc kubenswrapper[4775]: I1216 15:30:03.210856 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7dd869ac-017f-41de-be03-cf9afde344e8-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:03 crc kubenswrapper[4775]: I1216 15:30:03.450960 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj" event={"ID":"7dd869ac-017f-41de-be03-cf9afde344e8","Type":"ContainerDied","Data":"000c4b4c1e9a6fa21fa1cde34b23b9ef860361f6072bbef8dbbfd02e35f19157"} Dec 16 15:30:03 crc kubenswrapper[4775]: I1216 15:30:03.451294 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="000c4b4c1e9a6fa21fa1cde34b23b9ef860361f6072bbef8dbbfd02e35f19157" Dec 16 15:30:03 crc kubenswrapper[4775]: I1216 15:30:03.450971 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-w5jlj" Dec 16 15:30:03 crc kubenswrapper[4775]: I1216 15:30:03.452485 4775 generic.go:334] "Generic (PLEG): container finished" podID="45cea987-db5b-4cb3-944d-b242b75580b3" containerID="96a422fb1fe876f582745d467614c8862f6aab1b50db727da2225325a8b06b6f" exitCode=0 Dec 16 15:30:03 crc kubenswrapper[4775]: I1216 15:30:03.452531 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vsfkp" event={"ID":"45cea987-db5b-4cb3-944d-b242b75580b3","Type":"ContainerDied","Data":"96a422fb1fe876f582745d467614c8862f6aab1b50db727da2225325a8b06b6f"} Dec 16 15:30:03 crc kubenswrapper[4775]: I1216 15:30:03.452561 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vsfkp" event={"ID":"45cea987-db5b-4cb3-944d-b242b75580b3","Type":"ContainerStarted","Data":"5f6137c822b33373fe9a997dc24643d24d26ca4b774cce1ed0bd4aab4912cc7d"} Dec 16 15:30:03 crc kubenswrapper[4775]: I1216 15:30:03.607395 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b"] Dec 16 15:30:03 crc kubenswrapper[4775]: W1216 15:30:03.610201 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ac16fe7_1c6b_49c8_a9d2_97db6fa4dc36.slice/crio-d4de0f01042bfead2991fbbeb40a1a9751d0b1268fc219b178ed797945492953 WatchSource:0}: Error finding container d4de0f01042bfead2991fbbeb40a1a9751d0b1268fc219b178ed797945492953: Status 404 returned error can't find the container with id d4de0f01042bfead2991fbbeb40a1a9751d0b1268fc219b178ed797945492953 Dec 16 15:30:04 crc kubenswrapper[4775]: I1216 15:30:04.018337 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455"] Dec 16 15:30:04 crc kubenswrapper[4775]: I1216 15:30:04.026609 4775 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431605-6v455"] Dec 16 15:30:04 crc kubenswrapper[4775]: I1216 15:30:04.462957 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" event={"ID":"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36","Type":"ContainerStarted","Data":"d4de0f01042bfead2991fbbeb40a1a9751d0b1268fc219b178ed797945492953"} Dec 16 15:30:05 crc kubenswrapper[4775]: I1216 15:30:05.354759 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1734b68-7b3d-49f0-9398-879da24fa19d" path="/var/lib/kubelet/pods/a1734b68-7b3d-49f0-9398-879da24fa19d/volumes" Dec 16 15:30:05 crc kubenswrapper[4775]: I1216 15:30:05.474073 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" event={"ID":"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36","Type":"ContainerStarted","Data":"ebd2a4fa157ab230754a5db7128f86e015b7f331883eaed846273291e66de288"} Dec 16 15:30:05 crc kubenswrapper[4775]: I1216 15:30:05.498433 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" podStartSLOduration=2.5463506000000002 podStartE2EDuration="3.498410878s" podCreationTimestamp="2025-12-16 15:30:02 +0000 UTC" firstStartedPulling="2025-12-16 15:30:03.612593159 +0000 UTC m=+2128.563672082" lastFinishedPulling="2025-12-16 15:30:04.564653437 +0000 UTC m=+2129.515732360" observedRunningTime="2025-12-16 15:30:05.491979114 +0000 UTC m=+2130.443058047" watchObservedRunningTime="2025-12-16 15:30:05.498410878 +0000 UTC m=+2130.449489801" Dec 16 15:30:06 crc kubenswrapper[4775]: I1216 15:30:06.496007 4775 generic.go:334] "Generic (PLEG): container finished" podID="45cea987-db5b-4cb3-944d-b242b75580b3" containerID="f11e5d52acf5330ff4b9ca5dd8b1cee1b16de157fdb2ee1e3b2b07350dd1d59b" exitCode=0 Dec 16 15:30:06 crc kubenswrapper[4775]: I1216 15:30:06.496105 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vsfkp" event={"ID":"45cea987-db5b-4cb3-944d-b242b75580b3","Type":"ContainerDied","Data":"f11e5d52acf5330ff4b9ca5dd8b1cee1b16de157fdb2ee1e3b2b07350dd1d59b"} Dec 16 15:30:07 crc kubenswrapper[4775]: I1216 15:30:07.508739 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vsfkp" event={"ID":"45cea987-db5b-4cb3-944d-b242b75580b3","Type":"ContainerStarted","Data":"59b8bcc22ac61b0b296fc6bf91b76ede2f4dc9adc35c6f547ff34cd6fabbfc29"} Dec 16 15:30:07 crc kubenswrapper[4775]: I1216 15:30:07.537567 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vsfkp" podStartSLOduration=1.899204359 podStartE2EDuration="5.537542475s" podCreationTimestamp="2025-12-16 15:30:02 +0000 UTC" firstStartedPulling="2025-12-16 15:30:03.454937273 +0000 UTC m=+2128.406016196" lastFinishedPulling="2025-12-16 15:30:07.093275389 +0000 UTC m=+2132.044354312" observedRunningTime="2025-12-16 15:30:07.535226302 +0000 UTC m=+2132.486305215" watchObservedRunningTime="2025-12-16 15:30:07.537542475 +0000 UTC m=+2132.488621418" Dec 16 15:30:12 crc kubenswrapper[4775]: I1216 15:30:12.523961 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vsfkp" Dec 16 15:30:12 crc kubenswrapper[4775]: I1216 15:30:12.524629 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vsfkp" Dec 16 15:30:12 crc kubenswrapper[4775]: I1216 15:30:12.569398 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vsfkp" Dec 16 15:30:13 crc kubenswrapper[4775]: I1216 15:30:13.616506 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vsfkp" Dec 16 15:30:13 crc 
kubenswrapper[4775]: I1216 15:30:13.666214 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vsfkp"] Dec 16 15:30:15 crc kubenswrapper[4775]: I1216 15:30:15.810060 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vsfkp" podUID="45cea987-db5b-4cb3-944d-b242b75580b3" containerName="registry-server" containerID="cri-o://59b8bcc22ac61b0b296fc6bf91b76ede2f4dc9adc35c6f547ff34cd6fabbfc29" gracePeriod=2 Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.250061 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vsfkp" Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.412783 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45cea987-db5b-4cb3-944d-b242b75580b3-utilities\") pod \"45cea987-db5b-4cb3-944d-b242b75580b3\" (UID: \"45cea987-db5b-4cb3-944d-b242b75580b3\") " Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.413034 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84h47\" (UniqueName: \"kubernetes.io/projected/45cea987-db5b-4cb3-944d-b242b75580b3-kube-api-access-84h47\") pod \"45cea987-db5b-4cb3-944d-b242b75580b3\" (UID: \"45cea987-db5b-4cb3-944d-b242b75580b3\") " Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.413122 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45cea987-db5b-4cb3-944d-b242b75580b3-catalog-content\") pod \"45cea987-db5b-4cb3-944d-b242b75580b3\" (UID: \"45cea987-db5b-4cb3-944d-b242b75580b3\") " Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.413926 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/45cea987-db5b-4cb3-944d-b242b75580b3-utilities" (OuterVolumeSpecName: "utilities") pod "45cea987-db5b-4cb3-944d-b242b75580b3" (UID: "45cea987-db5b-4cb3-944d-b242b75580b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.420229 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45cea987-db5b-4cb3-944d-b242b75580b3-kube-api-access-84h47" (OuterVolumeSpecName: "kube-api-access-84h47") pod "45cea987-db5b-4cb3-944d-b242b75580b3" (UID: "45cea987-db5b-4cb3-944d-b242b75580b3"). InnerVolumeSpecName "kube-api-access-84h47". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.435026 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45cea987-db5b-4cb3-944d-b242b75580b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45cea987-db5b-4cb3-944d-b242b75580b3" (UID: "45cea987-db5b-4cb3-944d-b242b75580b3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.516358 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84h47\" (UniqueName: \"kubernetes.io/projected/45cea987-db5b-4cb3-944d-b242b75580b3-kube-api-access-84h47\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.516679 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45cea987-db5b-4cb3-944d-b242b75580b3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.516706 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45cea987-db5b-4cb3-944d-b242b75580b3-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.828223 4775 generic.go:334] "Generic (PLEG): container finished" podID="45cea987-db5b-4cb3-944d-b242b75580b3" containerID="59b8bcc22ac61b0b296fc6bf91b76ede2f4dc9adc35c6f547ff34cd6fabbfc29" exitCode=0 Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.828262 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vsfkp" event={"ID":"45cea987-db5b-4cb3-944d-b242b75580b3","Type":"ContainerDied","Data":"59b8bcc22ac61b0b296fc6bf91b76ede2f4dc9adc35c6f547ff34cd6fabbfc29"} Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.828594 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vsfkp" event={"ID":"45cea987-db5b-4cb3-944d-b242b75580b3","Type":"ContainerDied","Data":"5f6137c822b33373fe9a997dc24643d24d26ca4b774cce1ed0bd4aab4912cc7d"} Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.828617 4775 scope.go:117] "RemoveContainer" containerID="59b8bcc22ac61b0b296fc6bf91b76ede2f4dc9adc35c6f547ff34cd6fabbfc29" Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 
15:30:16.828313 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vsfkp" Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.904129 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vsfkp"] Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.909321 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vsfkp"] Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.916002 4775 scope.go:117] "RemoveContainer" containerID="f11e5d52acf5330ff4b9ca5dd8b1cee1b16de157fdb2ee1e3b2b07350dd1d59b" Dec 16 15:30:16 crc kubenswrapper[4775]: I1216 15:30:16.954717 4775 scope.go:117] "RemoveContainer" containerID="96a422fb1fe876f582745d467614c8862f6aab1b50db727da2225325a8b06b6f" Dec 16 15:30:17 crc kubenswrapper[4775]: I1216 15:30:17.000596 4775 scope.go:117] "RemoveContainer" containerID="59b8bcc22ac61b0b296fc6bf91b76ede2f4dc9adc35c6f547ff34cd6fabbfc29" Dec 16 15:30:17 crc kubenswrapper[4775]: E1216 15:30:17.001186 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59b8bcc22ac61b0b296fc6bf91b76ede2f4dc9adc35c6f547ff34cd6fabbfc29\": container with ID starting with 59b8bcc22ac61b0b296fc6bf91b76ede2f4dc9adc35c6f547ff34cd6fabbfc29 not found: ID does not exist" containerID="59b8bcc22ac61b0b296fc6bf91b76ede2f4dc9adc35c6f547ff34cd6fabbfc29" Dec 16 15:30:17 crc kubenswrapper[4775]: I1216 15:30:17.001234 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59b8bcc22ac61b0b296fc6bf91b76ede2f4dc9adc35c6f547ff34cd6fabbfc29"} err="failed to get container status \"59b8bcc22ac61b0b296fc6bf91b76ede2f4dc9adc35c6f547ff34cd6fabbfc29\": rpc error: code = NotFound desc = could not find container \"59b8bcc22ac61b0b296fc6bf91b76ede2f4dc9adc35c6f547ff34cd6fabbfc29\": container with ID starting with 
59b8bcc22ac61b0b296fc6bf91b76ede2f4dc9adc35c6f547ff34cd6fabbfc29 not found: ID does not exist" Dec 16 15:30:17 crc kubenswrapper[4775]: I1216 15:30:17.001263 4775 scope.go:117] "RemoveContainer" containerID="f11e5d52acf5330ff4b9ca5dd8b1cee1b16de157fdb2ee1e3b2b07350dd1d59b" Dec 16 15:30:17 crc kubenswrapper[4775]: E1216 15:30:17.001766 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f11e5d52acf5330ff4b9ca5dd8b1cee1b16de157fdb2ee1e3b2b07350dd1d59b\": container with ID starting with f11e5d52acf5330ff4b9ca5dd8b1cee1b16de157fdb2ee1e3b2b07350dd1d59b not found: ID does not exist" containerID="f11e5d52acf5330ff4b9ca5dd8b1cee1b16de157fdb2ee1e3b2b07350dd1d59b" Dec 16 15:30:17 crc kubenswrapper[4775]: I1216 15:30:17.001797 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f11e5d52acf5330ff4b9ca5dd8b1cee1b16de157fdb2ee1e3b2b07350dd1d59b"} err="failed to get container status \"f11e5d52acf5330ff4b9ca5dd8b1cee1b16de157fdb2ee1e3b2b07350dd1d59b\": rpc error: code = NotFound desc = could not find container \"f11e5d52acf5330ff4b9ca5dd8b1cee1b16de157fdb2ee1e3b2b07350dd1d59b\": container with ID starting with f11e5d52acf5330ff4b9ca5dd8b1cee1b16de157fdb2ee1e3b2b07350dd1d59b not found: ID does not exist" Dec 16 15:30:17 crc kubenswrapper[4775]: I1216 15:30:17.001820 4775 scope.go:117] "RemoveContainer" containerID="96a422fb1fe876f582745d467614c8862f6aab1b50db727da2225325a8b06b6f" Dec 16 15:30:17 crc kubenswrapper[4775]: E1216 15:30:17.002237 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96a422fb1fe876f582745d467614c8862f6aab1b50db727da2225325a8b06b6f\": container with ID starting with 96a422fb1fe876f582745d467614c8862f6aab1b50db727da2225325a8b06b6f not found: ID does not exist" containerID="96a422fb1fe876f582745d467614c8862f6aab1b50db727da2225325a8b06b6f" Dec 16 15:30:17 crc 
kubenswrapper[4775]: I1216 15:30:17.002292 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96a422fb1fe876f582745d467614c8862f6aab1b50db727da2225325a8b06b6f"} err="failed to get container status \"96a422fb1fe876f582745d467614c8862f6aab1b50db727da2225325a8b06b6f\": rpc error: code = NotFound desc = could not find container \"96a422fb1fe876f582745d467614c8862f6aab1b50db727da2225325a8b06b6f\": container with ID starting with 96a422fb1fe876f582745d467614c8862f6aab1b50db727da2225325a8b06b6f not found: ID does not exist" Dec 16 15:30:17 crc kubenswrapper[4775]: I1216 15:30:17.347841 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45cea987-db5b-4cb3-944d-b242b75580b3" path="/var/lib/kubelet/pods/45cea987-db5b-4cb3-944d-b242b75580b3/volumes" Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.559929 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wjq89"] Dec 16 15:30:21 crc kubenswrapper[4775]: E1216 15:30:21.560866 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45cea987-db5b-4cb3-944d-b242b75580b3" containerName="extract-utilities" Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.560879 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="45cea987-db5b-4cb3-944d-b242b75580b3" containerName="extract-utilities" Dec 16 15:30:21 crc kubenswrapper[4775]: E1216 15:30:21.562116 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45cea987-db5b-4cb3-944d-b242b75580b3" containerName="registry-server" Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.562141 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="45cea987-db5b-4cb3-944d-b242b75580b3" containerName="registry-server" Dec 16 15:30:21 crc kubenswrapper[4775]: E1216 15:30:21.562159 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd869ac-017f-41de-be03-cf9afde344e8" containerName="collect-profiles" 
Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.562166 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd869ac-017f-41de-be03-cf9afde344e8" containerName="collect-profiles" Dec 16 15:30:21 crc kubenswrapper[4775]: E1216 15:30:21.562201 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45cea987-db5b-4cb3-944d-b242b75580b3" containerName="extract-content" Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.562207 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="45cea987-db5b-4cb3-944d-b242b75580b3" containerName="extract-content" Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.562527 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd869ac-017f-41de-be03-cf9afde344e8" containerName="collect-profiles" Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.562566 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="45cea987-db5b-4cb3-944d-b242b75580b3" containerName="registry-server" Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.563989 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wjq89" Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.604410 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e8944cd-cabe-4317-b1d3-a706cc818a26-utilities\") pod \"certified-operators-wjq89\" (UID: \"1e8944cd-cabe-4317-b1d3-a706cc818a26\") " pod="openshift-marketplace/certified-operators-wjq89" Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.604474 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbgxx\" (UniqueName: \"kubernetes.io/projected/1e8944cd-cabe-4317-b1d3-a706cc818a26-kube-api-access-gbgxx\") pod \"certified-operators-wjq89\" (UID: \"1e8944cd-cabe-4317-b1d3-a706cc818a26\") " pod="openshift-marketplace/certified-operators-wjq89" Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.604569 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e8944cd-cabe-4317-b1d3-a706cc818a26-catalog-content\") pod \"certified-operators-wjq89\" (UID: \"1e8944cd-cabe-4317-b1d3-a706cc818a26\") " pod="openshift-marketplace/certified-operators-wjq89" Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.615641 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wjq89"] Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.705809 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e8944cd-cabe-4317-b1d3-a706cc818a26-catalog-content\") pod \"certified-operators-wjq89\" (UID: \"1e8944cd-cabe-4317-b1d3-a706cc818a26\") " pod="openshift-marketplace/certified-operators-wjq89" Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.706008 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e8944cd-cabe-4317-b1d3-a706cc818a26-utilities\") pod \"certified-operators-wjq89\" (UID: \"1e8944cd-cabe-4317-b1d3-a706cc818a26\") " pod="openshift-marketplace/certified-operators-wjq89" Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.706043 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbgxx\" (UniqueName: \"kubernetes.io/projected/1e8944cd-cabe-4317-b1d3-a706cc818a26-kube-api-access-gbgxx\") pod \"certified-operators-wjq89\" (UID: \"1e8944cd-cabe-4317-b1d3-a706cc818a26\") " pod="openshift-marketplace/certified-operators-wjq89" Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.706320 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e8944cd-cabe-4317-b1d3-a706cc818a26-catalog-content\") pod \"certified-operators-wjq89\" (UID: \"1e8944cd-cabe-4317-b1d3-a706cc818a26\") " pod="openshift-marketplace/certified-operators-wjq89" Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.706465 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e8944cd-cabe-4317-b1d3-a706cc818a26-utilities\") pod \"certified-operators-wjq89\" (UID: \"1e8944cd-cabe-4317-b1d3-a706cc818a26\") " pod="openshift-marketplace/certified-operators-wjq89" Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.738746 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbgxx\" (UniqueName: \"kubernetes.io/projected/1e8944cd-cabe-4317-b1d3-a706cc818a26-kube-api-access-gbgxx\") pod \"certified-operators-wjq89\" (UID: \"1e8944cd-cabe-4317-b1d3-a706cc818a26\") " pod="openshift-marketplace/certified-operators-wjq89" Dec 16 15:30:21 crc kubenswrapper[4775]: I1216 15:30:21.901045 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wjq89" Dec 16 15:30:22 crc kubenswrapper[4775]: I1216 15:30:22.392211 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wjq89"] Dec 16 15:30:22 crc kubenswrapper[4775]: I1216 15:30:22.877242 4775 generic.go:334] "Generic (PLEG): container finished" podID="1e8944cd-cabe-4317-b1d3-a706cc818a26" containerID="938d7669fcefd1619e205e44cdd5422a75a4b7e2fe1259dc9fcdd977b4ef9fd0" exitCode=0 Dec 16 15:30:22 crc kubenswrapper[4775]: I1216 15:30:22.877296 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjq89" event={"ID":"1e8944cd-cabe-4317-b1d3-a706cc818a26","Type":"ContainerDied","Data":"938d7669fcefd1619e205e44cdd5422a75a4b7e2fe1259dc9fcdd977b4ef9fd0"} Dec 16 15:30:22 crc kubenswrapper[4775]: I1216 15:30:22.877522 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjq89" event={"ID":"1e8944cd-cabe-4317-b1d3-a706cc818a26","Type":"ContainerStarted","Data":"dcc78bc2c3855a4710f385c68e3cb69d6a69547dd471c68f05f2f6d7d47ca86d"} Dec 16 15:30:24 crc kubenswrapper[4775]: I1216 15:30:24.902415 4775 generic.go:334] "Generic (PLEG): container finished" podID="1e8944cd-cabe-4317-b1d3-a706cc818a26" containerID="1feae6f1961db4a9ea4e11766da3131a89cfca7e8023636d5247181636df9a66" exitCode=0 Dec 16 15:30:24 crc kubenswrapper[4775]: I1216 15:30:24.902937 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjq89" event={"ID":"1e8944cd-cabe-4317-b1d3-a706cc818a26","Type":"ContainerDied","Data":"1feae6f1961db4a9ea4e11766da3131a89cfca7e8023636d5247181636df9a66"} Dec 16 15:30:25 crc kubenswrapper[4775]: I1216 15:30:25.913270 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjq89" 
event={"ID":"1e8944cd-cabe-4317-b1d3-a706cc818a26","Type":"ContainerStarted","Data":"322817bdc41738997d3849337553ffb074df0ffc615602e0fbb444a6ea9181a7"} Dec 16 15:30:25 crc kubenswrapper[4775]: I1216 15:30:25.936590 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wjq89" podStartSLOduration=2.299353986 podStartE2EDuration="4.936573193s" podCreationTimestamp="2025-12-16 15:30:21 +0000 UTC" firstStartedPulling="2025-12-16 15:30:22.878936616 +0000 UTC m=+2147.830015539" lastFinishedPulling="2025-12-16 15:30:25.516155823 +0000 UTC m=+2150.467234746" observedRunningTime="2025-12-16 15:30:25.93108561 +0000 UTC m=+2150.882164553" watchObservedRunningTime="2025-12-16 15:30:25.936573193 +0000 UTC m=+2150.887652116" Dec 16 15:30:31 crc kubenswrapper[4775]: I1216 15:30:31.902514 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wjq89" Dec 16 15:30:31 crc kubenswrapper[4775]: I1216 15:30:31.903005 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wjq89" Dec 16 15:30:31 crc kubenswrapper[4775]: I1216 15:30:31.949217 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wjq89" Dec 16 15:30:32 crc kubenswrapper[4775]: I1216 15:30:32.023018 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wjq89" Dec 16 15:30:32 crc kubenswrapper[4775]: I1216 15:30:32.190468 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wjq89"] Dec 16 15:30:32 crc kubenswrapper[4775]: I1216 15:30:32.868854 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:30:32 crc kubenswrapper[4775]: I1216 15:30:32.868922 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:30:34 crc kubenswrapper[4775]: I1216 15:30:34.001807 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wjq89" podUID="1e8944cd-cabe-4317-b1d3-a706cc818a26" containerName="registry-server" containerID="cri-o://322817bdc41738997d3849337553ffb074df0ffc615602e0fbb444a6ea9181a7" gracePeriod=2 Dec 16 15:30:34 crc kubenswrapper[4775]: I1216 15:30:34.456532 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wjq89" Dec 16 15:30:34 crc kubenswrapper[4775]: I1216 15:30:34.567939 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e8944cd-cabe-4317-b1d3-a706cc818a26-catalog-content\") pod \"1e8944cd-cabe-4317-b1d3-a706cc818a26\" (UID: \"1e8944cd-cabe-4317-b1d3-a706cc818a26\") " Dec 16 15:30:34 crc kubenswrapper[4775]: I1216 15:30:34.568415 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e8944cd-cabe-4317-b1d3-a706cc818a26-utilities\") pod \"1e8944cd-cabe-4317-b1d3-a706cc818a26\" (UID: \"1e8944cd-cabe-4317-b1d3-a706cc818a26\") " Dec 16 15:30:34 crc kubenswrapper[4775]: I1216 15:30:34.568618 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbgxx\" (UniqueName: 
\"kubernetes.io/projected/1e8944cd-cabe-4317-b1d3-a706cc818a26-kube-api-access-gbgxx\") pod \"1e8944cd-cabe-4317-b1d3-a706cc818a26\" (UID: \"1e8944cd-cabe-4317-b1d3-a706cc818a26\") " Dec 16 15:30:34 crc kubenswrapper[4775]: I1216 15:30:34.569209 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e8944cd-cabe-4317-b1d3-a706cc818a26-utilities" (OuterVolumeSpecName: "utilities") pod "1e8944cd-cabe-4317-b1d3-a706cc818a26" (UID: "1e8944cd-cabe-4317-b1d3-a706cc818a26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:30:34 crc kubenswrapper[4775]: I1216 15:30:34.574831 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e8944cd-cabe-4317-b1d3-a706cc818a26-kube-api-access-gbgxx" (OuterVolumeSpecName: "kube-api-access-gbgxx") pod "1e8944cd-cabe-4317-b1d3-a706cc818a26" (UID: "1e8944cd-cabe-4317-b1d3-a706cc818a26"). InnerVolumeSpecName "kube-api-access-gbgxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:30:34 crc kubenswrapper[4775]: I1216 15:30:34.670967 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbgxx\" (UniqueName: \"kubernetes.io/projected/1e8944cd-cabe-4317-b1d3-a706cc818a26-kube-api-access-gbgxx\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:34 crc kubenswrapper[4775]: I1216 15:30:34.671004 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e8944cd-cabe-4317-b1d3-a706cc818a26-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:34 crc kubenswrapper[4775]: I1216 15:30:34.941819 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e8944cd-cabe-4317-b1d3-a706cc818a26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e8944cd-cabe-4317-b1d3-a706cc818a26" (UID: "1e8944cd-cabe-4317-b1d3-a706cc818a26"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:30:34 crc kubenswrapper[4775]: I1216 15:30:34.976044 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e8944cd-cabe-4317-b1d3-a706cc818a26-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:35 crc kubenswrapper[4775]: I1216 15:30:35.012503 4775 generic.go:334] "Generic (PLEG): container finished" podID="1e8944cd-cabe-4317-b1d3-a706cc818a26" containerID="322817bdc41738997d3849337553ffb074df0ffc615602e0fbb444a6ea9181a7" exitCode=0 Dec 16 15:30:35 crc kubenswrapper[4775]: I1216 15:30:35.012551 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjq89" event={"ID":"1e8944cd-cabe-4317-b1d3-a706cc818a26","Type":"ContainerDied","Data":"322817bdc41738997d3849337553ffb074df0ffc615602e0fbb444a6ea9181a7"} Dec 16 15:30:35 crc kubenswrapper[4775]: I1216 15:30:35.012566 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wjq89" Dec 16 15:30:35 crc kubenswrapper[4775]: I1216 15:30:35.012583 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjq89" event={"ID":"1e8944cd-cabe-4317-b1d3-a706cc818a26","Type":"ContainerDied","Data":"dcc78bc2c3855a4710f385c68e3cb69d6a69547dd471c68f05f2f6d7d47ca86d"} Dec 16 15:30:35 crc kubenswrapper[4775]: I1216 15:30:35.012604 4775 scope.go:117] "RemoveContainer" containerID="322817bdc41738997d3849337553ffb074df0ffc615602e0fbb444a6ea9181a7" Dec 16 15:30:35 crc kubenswrapper[4775]: I1216 15:30:35.036545 4775 scope.go:117] "RemoveContainer" containerID="1feae6f1961db4a9ea4e11766da3131a89cfca7e8023636d5247181636df9a66" Dec 16 15:30:35 crc kubenswrapper[4775]: I1216 15:30:35.042864 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wjq89"] Dec 16 15:30:35 crc kubenswrapper[4775]: I1216 15:30:35.052772 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wjq89"] Dec 16 15:30:35 crc kubenswrapper[4775]: I1216 15:30:35.062013 4775 scope.go:117] "RemoveContainer" containerID="938d7669fcefd1619e205e44cdd5422a75a4b7e2fe1259dc9fcdd977b4ef9fd0" Dec 16 15:30:35 crc kubenswrapper[4775]: I1216 15:30:35.113462 4775 scope.go:117] "RemoveContainer" containerID="322817bdc41738997d3849337553ffb074df0ffc615602e0fbb444a6ea9181a7" Dec 16 15:30:35 crc kubenswrapper[4775]: E1216 15:30:35.113980 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"322817bdc41738997d3849337553ffb074df0ffc615602e0fbb444a6ea9181a7\": container with ID starting with 322817bdc41738997d3849337553ffb074df0ffc615602e0fbb444a6ea9181a7 not found: ID does not exist" containerID="322817bdc41738997d3849337553ffb074df0ffc615602e0fbb444a6ea9181a7" Dec 16 15:30:35 crc kubenswrapper[4775]: I1216 15:30:35.114010 4775 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322817bdc41738997d3849337553ffb074df0ffc615602e0fbb444a6ea9181a7"} err="failed to get container status \"322817bdc41738997d3849337553ffb074df0ffc615602e0fbb444a6ea9181a7\": rpc error: code = NotFound desc = could not find container \"322817bdc41738997d3849337553ffb074df0ffc615602e0fbb444a6ea9181a7\": container with ID starting with 322817bdc41738997d3849337553ffb074df0ffc615602e0fbb444a6ea9181a7 not found: ID does not exist" Dec 16 15:30:35 crc kubenswrapper[4775]: I1216 15:30:35.114031 4775 scope.go:117] "RemoveContainer" containerID="1feae6f1961db4a9ea4e11766da3131a89cfca7e8023636d5247181636df9a66" Dec 16 15:30:35 crc kubenswrapper[4775]: E1216 15:30:35.114345 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1feae6f1961db4a9ea4e11766da3131a89cfca7e8023636d5247181636df9a66\": container with ID starting with 1feae6f1961db4a9ea4e11766da3131a89cfca7e8023636d5247181636df9a66 not found: ID does not exist" containerID="1feae6f1961db4a9ea4e11766da3131a89cfca7e8023636d5247181636df9a66" Dec 16 15:30:35 crc kubenswrapper[4775]: I1216 15:30:35.114377 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1feae6f1961db4a9ea4e11766da3131a89cfca7e8023636d5247181636df9a66"} err="failed to get container status \"1feae6f1961db4a9ea4e11766da3131a89cfca7e8023636d5247181636df9a66\": rpc error: code = NotFound desc = could not find container \"1feae6f1961db4a9ea4e11766da3131a89cfca7e8023636d5247181636df9a66\": container with ID starting with 1feae6f1961db4a9ea4e11766da3131a89cfca7e8023636d5247181636df9a66 not found: ID does not exist" Dec 16 15:30:35 crc kubenswrapper[4775]: I1216 15:30:35.114396 4775 scope.go:117] "RemoveContainer" containerID="938d7669fcefd1619e205e44cdd5422a75a4b7e2fe1259dc9fcdd977b4ef9fd0" Dec 16 15:30:35 crc kubenswrapper[4775]: E1216 
15:30:35.114743 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"938d7669fcefd1619e205e44cdd5422a75a4b7e2fe1259dc9fcdd977b4ef9fd0\": container with ID starting with 938d7669fcefd1619e205e44cdd5422a75a4b7e2fe1259dc9fcdd977b4ef9fd0 not found: ID does not exist" containerID="938d7669fcefd1619e205e44cdd5422a75a4b7e2fe1259dc9fcdd977b4ef9fd0" Dec 16 15:30:35 crc kubenswrapper[4775]: I1216 15:30:35.114805 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"938d7669fcefd1619e205e44cdd5422a75a4b7e2fe1259dc9fcdd977b4ef9fd0"} err="failed to get container status \"938d7669fcefd1619e205e44cdd5422a75a4b7e2fe1259dc9fcdd977b4ef9fd0\": rpc error: code = NotFound desc = could not find container \"938d7669fcefd1619e205e44cdd5422a75a4b7e2fe1259dc9fcdd977b4ef9fd0\": container with ID starting with 938d7669fcefd1619e205e44cdd5422a75a4b7e2fe1259dc9fcdd977b4ef9fd0 not found: ID does not exist" Dec 16 15:30:35 crc kubenswrapper[4775]: I1216 15:30:35.354006 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e8944cd-cabe-4317-b1d3-a706cc818a26" path="/var/lib/kubelet/pods/1e8944cd-cabe-4317-b1d3-a706cc818a26/volumes" Dec 16 15:30:52 crc kubenswrapper[4775]: I1216 15:30:52.404338 4775 scope.go:117] "RemoveContainer" containerID="cd47c42773fde89848aef40302bc63d5eda4f961265c90cc888a03e584302595" Dec 16 15:31:02 crc kubenswrapper[4775]: I1216 15:31:02.869425 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:31:02 crc kubenswrapper[4775]: I1216 15:31:02.869976 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" 
podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:31:02 crc kubenswrapper[4775]: I1216 15:31:02.870029 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 15:31:02 crc kubenswrapper[4775]: I1216 15:31:02.870847 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a69bb344960c504a6cfe9e1c8feab0a47fc248099223e066857479364957b64"} pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:31:02 crc kubenswrapper[4775]: I1216 15:31:02.870945 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" containerID="cri-o://4a69bb344960c504a6cfe9e1c8feab0a47fc248099223e066857479364957b64" gracePeriod=600 Dec 16 15:31:03 crc kubenswrapper[4775]: I1216 15:31:03.242923 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerDied","Data":"4a69bb344960c504a6cfe9e1c8feab0a47fc248099223e066857479364957b64"} Dec 16 15:31:03 crc kubenswrapper[4775]: I1216 15:31:03.242970 4775 generic.go:334] "Generic (PLEG): container finished" podID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerID="4a69bb344960c504a6cfe9e1c8feab0a47fc248099223e066857479364957b64" exitCode=0 Dec 16 15:31:03 crc kubenswrapper[4775]: I1216 15:31:03.243278 4775 scope.go:117] "RemoveContainer" 
containerID="0b52e55b20c84ed5990833cb147db11decc174ad00b1ab3079efce9d20b16fda" Dec 16 15:31:03 crc kubenswrapper[4775]: I1216 15:31:03.243290 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26"} Dec 16 15:31:10 crc kubenswrapper[4775]: I1216 15:31:10.299083 4775 generic.go:334] "Generic (PLEG): container finished" podID="0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36" containerID="ebd2a4fa157ab230754a5db7128f86e015b7f331883eaed846273291e66de288" exitCode=0 Dec 16 15:31:10 crc kubenswrapper[4775]: I1216 15:31:10.299140 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" event={"ID":"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36","Type":"ContainerDied","Data":"ebd2a4fa157ab230754a5db7128f86e015b7f331883eaed846273291e66de288"} Dec 16 15:31:11 crc kubenswrapper[4775]: I1216 15:31:11.762519 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:31:11 crc kubenswrapper[4775]: I1216 15:31:11.878675 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4drf\" (UniqueName: \"kubernetes.io/projected/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-kube-api-access-v4drf\") pod \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " Dec 16 15:31:11 crc kubenswrapper[4775]: I1216 15:31:11.878806 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-inventory\") pod \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " Dec 16 15:31:11 crc kubenswrapper[4775]: I1216 15:31:11.878965 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ovn-combined-ca-bundle\") pod \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " Dec 16 15:31:11 crc kubenswrapper[4775]: I1216 15:31:11.879013 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ovncontroller-config-0\") pod \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " Dec 16 15:31:11 crc kubenswrapper[4775]: I1216 15:31:11.879101 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ssh-key\") pod \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\" (UID: \"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36\") " Dec 16 15:31:11 crc kubenswrapper[4775]: I1216 15:31:11.885330 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36" (UID: "0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:31:11 crc kubenswrapper[4775]: I1216 15:31:11.887215 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-kube-api-access-v4drf" (OuterVolumeSpecName: "kube-api-access-v4drf") pod "0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36" (UID: "0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36"). InnerVolumeSpecName "kube-api-access-v4drf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:31:11 crc kubenswrapper[4775]: I1216 15:31:11.910284 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36" (UID: "0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:31:11 crc kubenswrapper[4775]: I1216 15:31:11.912011 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36" (UID: "0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:31:11 crc kubenswrapper[4775]: I1216 15:31:11.914106 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-inventory" (OuterVolumeSpecName: "inventory") pod "0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36" (UID: "0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:31:11 crc kubenswrapper[4775]: I1216 15:31:11.980882 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:31:11 crc kubenswrapper[4775]: I1216 15:31:11.980920 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:31:11 crc kubenswrapper[4775]: I1216 15:31:11.980931 4775 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:31:11 crc kubenswrapper[4775]: I1216 15:31:11.980939 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:31:11 crc kubenswrapper[4775]: I1216 15:31:11.980947 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4drf\" (UniqueName: \"kubernetes.io/projected/0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36-kube-api-access-v4drf\") on node \"crc\" DevicePath \"\"" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.316361 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" event={"ID":"0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36","Type":"ContainerDied","Data":"d4de0f01042bfead2991fbbeb40a1a9751d0b1268fc219b178ed797945492953"} Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.316745 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4de0f01042bfead2991fbbeb40a1a9751d0b1268fc219b178ed797945492953" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.316436 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fdz4b" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.453795 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh"] Dec 16 15:31:12 crc kubenswrapper[4775]: E1216 15:31:12.454267 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.454312 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 16 15:31:12 crc kubenswrapper[4775]: E1216 15:31:12.454332 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8944cd-cabe-4317-b1d3-a706cc818a26" containerName="extract-content" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.454343 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8944cd-cabe-4317-b1d3-a706cc818a26" containerName="extract-content" Dec 16 15:31:12 crc kubenswrapper[4775]: E1216 15:31:12.454374 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8944cd-cabe-4317-b1d3-a706cc818a26" containerName="extract-utilities" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.454383 4775 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1e8944cd-cabe-4317-b1d3-a706cc818a26" containerName="extract-utilities" Dec 16 15:31:12 crc kubenswrapper[4775]: E1216 15:31:12.454400 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8944cd-cabe-4317-b1d3-a706cc818a26" containerName="registry-server" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.454408 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8944cd-cabe-4317-b1d3-a706cc818a26" containerName="registry-server" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.454651 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.454670 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e8944cd-cabe-4317-b1d3-a706cc818a26" containerName="registry-server" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.455436 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.462585 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.462824 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.463339 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.463502 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.463737 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.466013 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh"] Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.466411 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tgv5f" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.595164 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.595342 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.595383 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.595432 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdfmc\" (UniqueName: \"kubernetes.io/projected/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-kube-api-access-vdfmc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.595517 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.595656 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.697809 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.698012 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.698074 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.698109 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.698147 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdfmc\" (UniqueName: \"kubernetes.io/projected/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-kube-api-access-vdfmc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.698192 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.706416 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.707472 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.708102 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.708353 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.713346 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.717155 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdfmc\" (UniqueName: \"kubernetes.io/projected/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-kube-api-access-vdfmc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:12 crc kubenswrapper[4775]: I1216 15:31:12.772678 4775 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:31:13 crc kubenswrapper[4775]: I1216 15:31:13.299710 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh"] Dec 16 15:31:13 crc kubenswrapper[4775]: I1216 15:31:13.324130 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" event={"ID":"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e","Type":"ContainerStarted","Data":"28b15592be81b550b24e1ea7853e53311d976a38ce17baddf51ae15498f84c61"} Dec 16 15:31:14 crc kubenswrapper[4775]: I1216 15:31:14.336634 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" event={"ID":"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e","Type":"ContainerStarted","Data":"40620dc520efecae5eb2ae75aac03f7a3c06f0651559435779190c842a64c92e"} Dec 16 15:31:14 crc kubenswrapper[4775]: I1216 15:31:14.361125 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" podStartSLOduration=1.752433007 podStartE2EDuration="2.361101404s" podCreationTimestamp="2025-12-16 15:31:12 +0000 UTC" firstStartedPulling="2025-12-16 15:31:13.303146989 +0000 UTC m=+2198.254225912" lastFinishedPulling="2025-12-16 15:31:13.911815386 +0000 UTC m=+2198.862894309" observedRunningTime="2025-12-16 15:31:14.350612204 +0000 UTC m=+2199.301691167" watchObservedRunningTime="2025-12-16 15:31:14.361101404 +0000 UTC m=+2199.312180337" Dec 16 15:31:21 crc kubenswrapper[4775]: I1216 15:31:21.939295 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kj755"] Dec 16 15:31:21 crc kubenswrapper[4775]: I1216 15:31:21.943497 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kj755" Dec 16 15:31:21 crc kubenswrapper[4775]: I1216 15:31:21.954015 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kj755"] Dec 16 15:31:22 crc kubenswrapper[4775]: I1216 15:31:22.063854 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24da6229-bb33-41b3-bb6d-c494758be231-catalog-content\") pod \"community-operators-kj755\" (UID: \"24da6229-bb33-41b3-bb6d-c494758be231\") " pod="openshift-marketplace/community-operators-kj755" Dec 16 15:31:22 crc kubenswrapper[4775]: I1216 15:31:22.063931 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24da6229-bb33-41b3-bb6d-c494758be231-utilities\") pod \"community-operators-kj755\" (UID: \"24da6229-bb33-41b3-bb6d-c494758be231\") " pod="openshift-marketplace/community-operators-kj755" Dec 16 15:31:22 crc kubenswrapper[4775]: I1216 15:31:22.064194 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvv4x\" (UniqueName: \"kubernetes.io/projected/24da6229-bb33-41b3-bb6d-c494758be231-kube-api-access-fvv4x\") pod \"community-operators-kj755\" (UID: \"24da6229-bb33-41b3-bb6d-c494758be231\") " pod="openshift-marketplace/community-operators-kj755" Dec 16 15:31:22 crc kubenswrapper[4775]: I1216 15:31:22.167146 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvv4x\" (UniqueName: \"kubernetes.io/projected/24da6229-bb33-41b3-bb6d-c494758be231-kube-api-access-fvv4x\") pod \"community-operators-kj755\" (UID: \"24da6229-bb33-41b3-bb6d-c494758be231\") " pod="openshift-marketplace/community-operators-kj755" Dec 16 15:31:22 crc kubenswrapper[4775]: I1216 15:31:22.167479 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24da6229-bb33-41b3-bb6d-c494758be231-catalog-content\") pod \"community-operators-kj755\" (UID: \"24da6229-bb33-41b3-bb6d-c494758be231\") " pod="openshift-marketplace/community-operators-kj755" Dec 16 15:31:22 crc kubenswrapper[4775]: I1216 15:31:22.167595 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24da6229-bb33-41b3-bb6d-c494758be231-utilities\") pod \"community-operators-kj755\" (UID: \"24da6229-bb33-41b3-bb6d-c494758be231\") " pod="openshift-marketplace/community-operators-kj755" Dec 16 15:31:22 crc kubenswrapper[4775]: I1216 15:31:22.168213 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24da6229-bb33-41b3-bb6d-c494758be231-utilities\") pod \"community-operators-kj755\" (UID: \"24da6229-bb33-41b3-bb6d-c494758be231\") " pod="openshift-marketplace/community-operators-kj755" Dec 16 15:31:22 crc kubenswrapper[4775]: I1216 15:31:22.168764 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24da6229-bb33-41b3-bb6d-c494758be231-catalog-content\") pod \"community-operators-kj755\" (UID: \"24da6229-bb33-41b3-bb6d-c494758be231\") " pod="openshift-marketplace/community-operators-kj755" Dec 16 15:31:22 crc kubenswrapper[4775]: I1216 15:31:22.198906 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvv4x\" (UniqueName: \"kubernetes.io/projected/24da6229-bb33-41b3-bb6d-c494758be231-kube-api-access-fvv4x\") pod \"community-operators-kj755\" (UID: \"24da6229-bb33-41b3-bb6d-c494758be231\") " pod="openshift-marketplace/community-operators-kj755" Dec 16 15:31:22 crc kubenswrapper[4775]: I1216 15:31:22.281042 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kj755" Dec 16 15:31:22 crc kubenswrapper[4775]: I1216 15:31:22.836354 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kj755"] Dec 16 15:31:23 crc kubenswrapper[4775]: I1216 15:31:23.408876 4775 generic.go:334] "Generic (PLEG): container finished" podID="24da6229-bb33-41b3-bb6d-c494758be231" containerID="35c9cea76f28262ae5fe101967c68606da3ad73c4de5639b41ad27ef342855d6" exitCode=0 Dec 16 15:31:23 crc kubenswrapper[4775]: I1216 15:31:23.408934 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj755" event={"ID":"24da6229-bb33-41b3-bb6d-c494758be231","Type":"ContainerDied","Data":"35c9cea76f28262ae5fe101967c68606da3ad73c4de5639b41ad27ef342855d6"} Dec 16 15:31:23 crc kubenswrapper[4775]: I1216 15:31:23.409204 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj755" event={"ID":"24da6229-bb33-41b3-bb6d-c494758be231","Type":"ContainerStarted","Data":"461b664ad0a5ad4cec2daf3bc3b88e8ec1b19ae06729eeaa553ea1bbab3fa97e"} Dec 16 15:31:24 crc kubenswrapper[4775]: I1216 15:31:24.431755 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj755" event={"ID":"24da6229-bb33-41b3-bb6d-c494758be231","Type":"ContainerStarted","Data":"3777c6e54a1eba9b69779ca11d7c4ffa7d1750b6593cc3d8bfd0758a5db623c0"} Dec 16 15:31:25 crc kubenswrapper[4775]: I1216 15:31:25.443001 4775 generic.go:334] "Generic (PLEG): container finished" podID="24da6229-bb33-41b3-bb6d-c494758be231" containerID="3777c6e54a1eba9b69779ca11d7c4ffa7d1750b6593cc3d8bfd0758a5db623c0" exitCode=0 Dec 16 15:31:25 crc kubenswrapper[4775]: I1216 15:31:25.443312 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj755" 
event={"ID":"24da6229-bb33-41b3-bb6d-c494758be231","Type":"ContainerDied","Data":"3777c6e54a1eba9b69779ca11d7c4ffa7d1750b6593cc3d8bfd0758a5db623c0"} Dec 16 15:31:26 crc kubenswrapper[4775]: I1216 15:31:26.455644 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj755" event={"ID":"24da6229-bb33-41b3-bb6d-c494758be231","Type":"ContainerStarted","Data":"044e5011767182bc9d22169d98fad50aeb0e132998559f5496c23654a14cb358"} Dec 16 15:31:26 crc kubenswrapper[4775]: I1216 15:31:26.475991 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kj755" podStartSLOduration=2.973857806 podStartE2EDuration="5.475974459s" podCreationTimestamp="2025-12-16 15:31:21 +0000 UTC" firstStartedPulling="2025-12-16 15:31:23.410790204 +0000 UTC m=+2208.361869147" lastFinishedPulling="2025-12-16 15:31:25.912906877 +0000 UTC m=+2210.863985800" observedRunningTime="2025-12-16 15:31:26.473683737 +0000 UTC m=+2211.424762670" watchObservedRunningTime="2025-12-16 15:31:26.475974459 +0000 UTC m=+2211.427053382" Dec 16 15:31:32 crc kubenswrapper[4775]: I1216 15:31:32.281219 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kj755" Dec 16 15:31:32 crc kubenswrapper[4775]: I1216 15:31:32.281754 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kj755" Dec 16 15:31:32 crc kubenswrapper[4775]: I1216 15:31:32.322620 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kj755" Dec 16 15:31:32 crc kubenswrapper[4775]: I1216 15:31:32.567044 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kj755" Dec 16 15:31:32 crc kubenswrapper[4775]: I1216 15:31:32.617793 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-kj755"] Dec 16 15:31:34 crc kubenswrapper[4775]: I1216 15:31:34.528244 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kj755" podUID="24da6229-bb33-41b3-bb6d-c494758be231" containerName="registry-server" containerID="cri-o://044e5011767182bc9d22169d98fad50aeb0e132998559f5496c23654a14cb358" gracePeriod=2 Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.044105 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kj755" Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.131536 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24da6229-bb33-41b3-bb6d-c494758be231-utilities\") pod \"24da6229-bb33-41b3-bb6d-c494758be231\" (UID: \"24da6229-bb33-41b3-bb6d-c494758be231\") " Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.131642 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24da6229-bb33-41b3-bb6d-c494758be231-catalog-content\") pod \"24da6229-bb33-41b3-bb6d-c494758be231\" (UID: \"24da6229-bb33-41b3-bb6d-c494758be231\") " Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.131812 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvv4x\" (UniqueName: \"kubernetes.io/projected/24da6229-bb33-41b3-bb6d-c494758be231-kube-api-access-fvv4x\") pod \"24da6229-bb33-41b3-bb6d-c494758be231\" (UID: \"24da6229-bb33-41b3-bb6d-c494758be231\") " Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.132440 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24da6229-bb33-41b3-bb6d-c494758be231-utilities" (OuterVolumeSpecName: "utilities") pod "24da6229-bb33-41b3-bb6d-c494758be231" (UID: 
"24da6229-bb33-41b3-bb6d-c494758be231"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.141110 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24da6229-bb33-41b3-bb6d-c494758be231-kube-api-access-fvv4x" (OuterVolumeSpecName: "kube-api-access-fvv4x") pod "24da6229-bb33-41b3-bb6d-c494758be231" (UID: "24da6229-bb33-41b3-bb6d-c494758be231"). InnerVolumeSpecName "kube-api-access-fvv4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.184767 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24da6229-bb33-41b3-bb6d-c494758be231-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24da6229-bb33-41b3-bb6d-c494758be231" (UID: "24da6229-bb33-41b3-bb6d-c494758be231"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.234129 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvv4x\" (UniqueName: \"kubernetes.io/projected/24da6229-bb33-41b3-bb6d-c494758be231-kube-api-access-fvv4x\") on node \"crc\" DevicePath \"\"" Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.234164 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24da6229-bb33-41b3-bb6d-c494758be231-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.234178 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24da6229-bb33-41b3-bb6d-c494758be231-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.537410 4775 generic.go:334] "Generic (PLEG): container finished" 
podID="24da6229-bb33-41b3-bb6d-c494758be231" containerID="044e5011767182bc9d22169d98fad50aeb0e132998559f5496c23654a14cb358" exitCode=0 Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.537467 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj755" event={"ID":"24da6229-bb33-41b3-bb6d-c494758be231","Type":"ContainerDied","Data":"044e5011767182bc9d22169d98fad50aeb0e132998559f5496c23654a14cb358"} Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.537542 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kj755" Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.537797 4775 scope.go:117] "RemoveContainer" containerID="044e5011767182bc9d22169d98fad50aeb0e132998559f5496c23654a14cb358" Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.537784 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj755" event={"ID":"24da6229-bb33-41b3-bb6d-c494758be231","Type":"ContainerDied","Data":"461b664ad0a5ad4cec2daf3bc3b88e8ec1b19ae06729eeaa553ea1bbab3fa97e"} Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.564500 4775 scope.go:117] "RemoveContainer" containerID="3777c6e54a1eba9b69779ca11d7c4ffa7d1750b6593cc3d8bfd0758a5db623c0" Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.568991 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kj755"] Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.581450 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kj755"] Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.584942 4775 scope.go:117] "RemoveContainer" containerID="35c9cea76f28262ae5fe101967c68606da3ad73c4de5639b41ad27ef342855d6" Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.649938 4775 scope.go:117] "RemoveContainer" 
containerID="044e5011767182bc9d22169d98fad50aeb0e132998559f5496c23654a14cb358" Dec 16 15:31:35 crc kubenswrapper[4775]: E1216 15:31:35.650488 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"044e5011767182bc9d22169d98fad50aeb0e132998559f5496c23654a14cb358\": container with ID starting with 044e5011767182bc9d22169d98fad50aeb0e132998559f5496c23654a14cb358 not found: ID does not exist" containerID="044e5011767182bc9d22169d98fad50aeb0e132998559f5496c23654a14cb358" Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.650529 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"044e5011767182bc9d22169d98fad50aeb0e132998559f5496c23654a14cb358"} err="failed to get container status \"044e5011767182bc9d22169d98fad50aeb0e132998559f5496c23654a14cb358\": rpc error: code = NotFound desc = could not find container \"044e5011767182bc9d22169d98fad50aeb0e132998559f5496c23654a14cb358\": container with ID starting with 044e5011767182bc9d22169d98fad50aeb0e132998559f5496c23654a14cb358 not found: ID does not exist" Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.650549 4775 scope.go:117] "RemoveContainer" containerID="3777c6e54a1eba9b69779ca11d7c4ffa7d1750b6593cc3d8bfd0758a5db623c0" Dec 16 15:31:35 crc kubenswrapper[4775]: E1216 15:31:35.650810 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3777c6e54a1eba9b69779ca11d7c4ffa7d1750b6593cc3d8bfd0758a5db623c0\": container with ID starting with 3777c6e54a1eba9b69779ca11d7c4ffa7d1750b6593cc3d8bfd0758a5db623c0 not found: ID does not exist" containerID="3777c6e54a1eba9b69779ca11d7c4ffa7d1750b6593cc3d8bfd0758a5db623c0" Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.650845 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3777c6e54a1eba9b69779ca11d7c4ffa7d1750b6593cc3d8bfd0758a5db623c0"} err="failed to get container status \"3777c6e54a1eba9b69779ca11d7c4ffa7d1750b6593cc3d8bfd0758a5db623c0\": rpc error: code = NotFound desc = could not find container \"3777c6e54a1eba9b69779ca11d7c4ffa7d1750b6593cc3d8bfd0758a5db623c0\": container with ID starting with 3777c6e54a1eba9b69779ca11d7c4ffa7d1750b6593cc3d8bfd0758a5db623c0 not found: ID does not exist" Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.650865 4775 scope.go:117] "RemoveContainer" containerID="35c9cea76f28262ae5fe101967c68606da3ad73c4de5639b41ad27ef342855d6" Dec 16 15:31:35 crc kubenswrapper[4775]: E1216 15:31:35.651401 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35c9cea76f28262ae5fe101967c68606da3ad73c4de5639b41ad27ef342855d6\": container with ID starting with 35c9cea76f28262ae5fe101967c68606da3ad73c4de5639b41ad27ef342855d6 not found: ID does not exist" containerID="35c9cea76f28262ae5fe101967c68606da3ad73c4de5639b41ad27ef342855d6" Dec 16 15:31:35 crc kubenswrapper[4775]: I1216 15:31:35.651426 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35c9cea76f28262ae5fe101967c68606da3ad73c4de5639b41ad27ef342855d6"} err="failed to get container status \"35c9cea76f28262ae5fe101967c68606da3ad73c4de5639b41ad27ef342855d6\": rpc error: code = NotFound desc = could not find container \"35c9cea76f28262ae5fe101967c68606da3ad73c4de5639b41ad27ef342855d6\": container with ID starting with 35c9cea76f28262ae5fe101967c68606da3ad73c4de5639b41ad27ef342855d6 not found: ID does not exist" Dec 16 15:31:37 crc kubenswrapper[4775]: I1216 15:31:37.348711 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24da6229-bb33-41b3-bb6d-c494758be231" path="/var/lib/kubelet/pods/24da6229-bb33-41b3-bb6d-c494758be231/volumes" Dec 16 15:32:04 crc kubenswrapper[4775]: I1216 
15:32:04.794452 4775 generic.go:334] "Generic (PLEG): container finished" podID="9a992ef8-ad46-4e3a-a98a-dc75ad484c7e" containerID="40620dc520efecae5eb2ae75aac03f7a3c06f0651559435779190c842a64c92e" exitCode=0 Dec 16 15:32:04 crc kubenswrapper[4775]: I1216 15:32:04.794563 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" event={"ID":"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e","Type":"ContainerDied","Data":"40620dc520efecae5eb2ae75aac03f7a3c06f0651559435779190c842a64c92e"} Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.346674 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.525119 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-neutron-metadata-combined-ca-bundle\") pod \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.525201 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.525275 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-nova-metadata-neutron-config-0\") pod \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " Dec 16 15:32:06 crc kubenswrapper[4775]: 
I1216 15:32:06.525419 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-inventory\") pod \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.525552 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-ssh-key\") pod \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.525574 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdfmc\" (UniqueName: \"kubernetes.io/projected/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-kube-api-access-vdfmc\") pod \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\" (UID: \"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e\") " Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.538675 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-kube-api-access-vdfmc" (OuterVolumeSpecName: "kube-api-access-vdfmc") pod "9a992ef8-ad46-4e3a-a98a-dc75ad484c7e" (UID: "9a992ef8-ad46-4e3a-a98a-dc75ad484c7e"). InnerVolumeSpecName "kube-api-access-vdfmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.538862 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9a992ef8-ad46-4e3a-a98a-dc75ad484c7e" (UID: "9a992ef8-ad46-4e3a-a98a-dc75ad484c7e"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.553867 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9a992ef8-ad46-4e3a-a98a-dc75ad484c7e" (UID: "9a992ef8-ad46-4e3a-a98a-dc75ad484c7e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.556428 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9a992ef8-ad46-4e3a-a98a-dc75ad484c7e" (UID: "9a992ef8-ad46-4e3a-a98a-dc75ad484c7e"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.556833 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9a992ef8-ad46-4e3a-a98a-dc75ad484c7e" (UID: "9a992ef8-ad46-4e3a-a98a-dc75ad484c7e"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.558438 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-inventory" (OuterVolumeSpecName: "inventory") pod "9a992ef8-ad46-4e3a-a98a-dc75ad484c7e" (UID: "9a992ef8-ad46-4e3a-a98a-dc75ad484c7e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.627449 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.627495 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdfmc\" (UniqueName: \"kubernetes.io/projected/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-kube-api-access-vdfmc\") on node \"crc\" DevicePath \"\"" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.627509 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.627521 4775 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.627534 4775 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.627547 4775 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a992ef8-ad46-4e3a-a98a-dc75ad484c7e-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.815627 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" 
event={"ID":"9a992ef8-ad46-4e3a-a98a-dc75ad484c7e","Type":"ContainerDied","Data":"28b15592be81b550b24e1ea7853e53311d976a38ce17baddf51ae15498f84c61"} Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.815669 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28b15592be81b550b24e1ea7853e53311d976a38ce17baddf51ae15498f84c61" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.815693 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.925689 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9"] Dec 16 15:32:06 crc kubenswrapper[4775]: E1216 15:32:06.926997 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24da6229-bb33-41b3-bb6d-c494758be231" containerName="registry-server" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.927018 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="24da6229-bb33-41b3-bb6d-c494758be231" containerName="registry-server" Dec 16 15:32:06 crc kubenswrapper[4775]: E1216 15:32:06.927040 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24da6229-bb33-41b3-bb6d-c494758be231" containerName="extract-content" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.927046 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="24da6229-bb33-41b3-bb6d-c494758be231" containerName="extract-content" Dec 16 15:32:06 crc kubenswrapper[4775]: E1216 15:32:06.927064 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a992ef8-ad46-4e3a-a98a-dc75ad484c7e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.927071 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a992ef8-ad46-4e3a-a98a-dc75ad484c7e" 
containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 16 15:32:06 crc kubenswrapper[4775]: E1216 15:32:06.927098 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24da6229-bb33-41b3-bb6d-c494758be231" containerName="extract-utilities" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.927104 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="24da6229-bb33-41b3-bb6d-c494758be231" containerName="extract-utilities" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.927281 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="24da6229-bb33-41b3-bb6d-c494758be231" containerName="registry-server" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.927302 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a992ef8-ad46-4e3a-a98a-dc75ad484c7e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.927979 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.933506 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9"] Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.935316 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.935324 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tgv5f" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.935459 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.935320 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:32:06 crc kubenswrapper[4775]: I1216 15:32:06.935321 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.034095 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.034158 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.034185 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpgl9\" (UniqueName: \"kubernetes.io/projected/0570786e-5fec-43cf-b7ec-12a4facea06d-kube-api-access-mpgl9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.034210 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.034557 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.136721 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.136776 4775 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.136808 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpgl9\" (UniqueName: \"kubernetes.io/projected/0570786e-5fec-43cf-b7ec-12a4facea06d-kube-api-access-mpgl9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.136847 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.136996 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.143613 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.143615 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.143731 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.144197 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.156323 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpgl9\" (UniqueName: \"kubernetes.io/projected/0570786e-5fec-43cf-b7ec-12a4facea06d-kube-api-access-mpgl9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.246876 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.755838 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9"] Dec 16 15:32:07 crc kubenswrapper[4775]: I1216 15:32:07.825610 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" event={"ID":"0570786e-5fec-43cf-b7ec-12a4facea06d","Type":"ContainerStarted","Data":"2a8e14d88492e5a172f16d74408d550a1a585ab9cbd29669e25e26d64ca6c932"} Dec 16 15:32:08 crc kubenswrapper[4775]: I1216 15:32:08.837327 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" event={"ID":"0570786e-5fec-43cf-b7ec-12a4facea06d","Type":"ContainerStarted","Data":"504a565d30e68cfd4acae351183268c195b9bc127842b0833d1b2400dc7db6c5"} Dec 16 15:32:08 crc kubenswrapper[4775]: I1216 15:32:08.858038 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" podStartSLOduration=2.354673178 podStartE2EDuration="2.858012388s" podCreationTimestamp="2025-12-16 15:32:06 +0000 UTC" firstStartedPulling="2025-12-16 15:32:07.757242174 +0000 UTC m=+2252.708321107" lastFinishedPulling="2025-12-16 15:32:08.260581394 +0000 UTC m=+2253.211660317" observedRunningTime="2025-12-16 15:32:08.854321211 +0000 UTC m=+2253.805400134" watchObservedRunningTime="2025-12-16 15:32:08.858012388 +0000 UTC m=+2253.809091341" Dec 16 15:33:32 crc kubenswrapper[4775]: I1216 15:33:32.868854 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:33:32 crc kubenswrapper[4775]: I1216 
15:33:32.871317 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:34:02 crc kubenswrapper[4775]: I1216 15:34:02.869491 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:34:02 crc kubenswrapper[4775]: I1216 15:34:02.870088 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:34:32 crc kubenswrapper[4775]: I1216 15:34:32.869303 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:34:32 crc kubenswrapper[4775]: I1216 15:34:32.869973 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:34:32 crc kubenswrapper[4775]: I1216 15:34:32.870029 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 15:34:32 crc kubenswrapper[4775]: I1216 15:34:32.870878 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26"} pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:34:32 crc kubenswrapper[4775]: I1216 15:34:32.870967 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" containerID="cri-o://cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" gracePeriod=600 Dec 16 15:34:33 crc kubenswrapper[4775]: E1216 15:34:33.004854 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:34:33 crc kubenswrapper[4775]: I1216 15:34:33.068046 4775 generic.go:334] "Generic (PLEG): container finished" podID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" exitCode=0 Dec 16 15:34:33 crc kubenswrapper[4775]: I1216 15:34:33.068109 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerDied","Data":"cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26"} Dec 16 15:34:33 crc 
kubenswrapper[4775]: I1216 15:34:33.068442 4775 scope.go:117] "RemoveContainer" containerID="4a69bb344960c504a6cfe9e1c8feab0a47fc248099223e066857479364957b64" Dec 16 15:34:33 crc kubenswrapper[4775]: I1216 15:34:33.069110 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:34:33 crc kubenswrapper[4775]: E1216 15:34:33.069382 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:34:46 crc kubenswrapper[4775]: I1216 15:34:46.338295 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:34:46 crc kubenswrapper[4775]: E1216 15:34:46.339103 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:34:57 crc kubenswrapper[4775]: I1216 15:34:57.338310 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:34:57 crc kubenswrapper[4775]: E1216 15:34:57.339034 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:35:08 crc kubenswrapper[4775]: I1216 15:35:08.337929 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:35:08 crc kubenswrapper[4775]: E1216 15:35:08.338751 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:35:23 crc kubenswrapper[4775]: I1216 15:35:23.338178 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:35:23 crc kubenswrapper[4775]: E1216 15:35:23.339719 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:35:34 crc kubenswrapper[4775]: I1216 15:35:34.338468 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:35:34 crc kubenswrapper[4775]: E1216 15:35:34.339677 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:35:49 crc kubenswrapper[4775]: I1216 15:35:49.338324 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:35:49 crc kubenswrapper[4775]: E1216 15:35:49.339195 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:36:03 crc kubenswrapper[4775]: I1216 15:36:03.337654 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:36:03 crc kubenswrapper[4775]: E1216 15:36:03.338427 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:36:14 crc kubenswrapper[4775]: I1216 15:36:14.338332 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:36:14 crc kubenswrapper[4775]: E1216 15:36:14.339167 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:36:14 crc kubenswrapper[4775]: I1216 15:36:14.917722 4775 generic.go:334] "Generic (PLEG): container finished" podID="0570786e-5fec-43cf-b7ec-12a4facea06d" containerID="504a565d30e68cfd4acae351183268c195b9bc127842b0833d1b2400dc7db6c5" exitCode=0 Dec 16 15:36:14 crc kubenswrapper[4775]: I1216 15:36:14.917798 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" event={"ID":"0570786e-5fec-43cf-b7ec-12a4facea06d","Type":"ContainerDied","Data":"504a565d30e68cfd4acae351183268c195b9bc127842b0833d1b2400dc7db6c5"} Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.344764 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.469864 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpgl9\" (UniqueName: \"kubernetes.io/projected/0570786e-5fec-43cf-b7ec-12a4facea06d-kube-api-access-mpgl9\") pod \"0570786e-5fec-43cf-b7ec-12a4facea06d\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.470288 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-inventory\") pod \"0570786e-5fec-43cf-b7ec-12a4facea06d\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.470335 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-libvirt-secret-0\") pod \"0570786e-5fec-43cf-b7ec-12a4facea06d\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.470435 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-libvirt-combined-ca-bundle\") pod \"0570786e-5fec-43cf-b7ec-12a4facea06d\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.472000 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-ssh-key\") pod \"0570786e-5fec-43cf-b7ec-12a4facea06d\" (UID: \"0570786e-5fec-43cf-b7ec-12a4facea06d\") " Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.477087 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0570786e-5fec-43cf-b7ec-12a4facea06d" (UID: "0570786e-5fec-43cf-b7ec-12a4facea06d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.477118 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0570786e-5fec-43cf-b7ec-12a4facea06d-kube-api-access-mpgl9" (OuterVolumeSpecName: "kube-api-access-mpgl9") pod "0570786e-5fec-43cf-b7ec-12a4facea06d" (UID: "0570786e-5fec-43cf-b7ec-12a4facea06d"). InnerVolumeSpecName "kube-api-access-mpgl9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.508261 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-inventory" (OuterVolumeSpecName: "inventory") pod "0570786e-5fec-43cf-b7ec-12a4facea06d" (UID: "0570786e-5fec-43cf-b7ec-12a4facea06d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.517037 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0570786e-5fec-43cf-b7ec-12a4facea06d" (UID: "0570786e-5fec-43cf-b7ec-12a4facea06d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.521553 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "0570786e-5fec-43cf-b7ec-12a4facea06d" (UID: "0570786e-5fec-43cf-b7ec-12a4facea06d"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.575181 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.575224 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpgl9\" (UniqueName: \"kubernetes.io/projected/0570786e-5fec-43cf-b7ec-12a4facea06d-kube-api-access-mpgl9\") on node \"crc\" DevicePath \"\"" Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.575241 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.575256 4775 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.575269 4775 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0570786e-5fec-43cf-b7ec-12a4facea06d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.935382 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" event={"ID":"0570786e-5fec-43cf-b7ec-12a4facea06d","Type":"ContainerDied","Data":"2a8e14d88492e5a172f16d74408d550a1a585ab9cbd29669e25e26d64ca6c932"} Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 15:36:16.935430 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a8e14d88492e5a172f16d74408d550a1a585ab9cbd29669e25e26d64ca6c932" Dec 16 15:36:16 crc kubenswrapper[4775]: I1216 
15:36:16.935479 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.040615 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f"] Dec 16 15:36:17 crc kubenswrapper[4775]: E1216 15:36:17.041026 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0570786e-5fec-43cf-b7ec-12a4facea06d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.041042 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0570786e-5fec-43cf-b7ec-12a4facea06d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.041235 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0570786e-5fec-43cf-b7ec-12a4facea06d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.041839 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.044621 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.044697 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tgv5f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.044732 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.045029 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.045230 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.045428 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.045581 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.052749 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f"] Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.186578 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 
15:36:17.186987 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.187084 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.187193 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.187265 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.187339 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbs7z\" (UniqueName: 
\"kubernetes.io/projected/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-kube-api-access-kbs7z\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.187470 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.187554 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.187716 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.289017 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.289092 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.289168 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.289213 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.289241 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.289273 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.289315 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.289341 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.289368 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbs7z\" (UniqueName: \"kubernetes.io/projected/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-kube-api-access-kbs7z\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.290714 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc 
kubenswrapper[4775]: I1216 15:36:17.295730 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.299234 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.299600 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.305411 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.306706 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-combined-ca-bundle\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.309257 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.313542 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.342667 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbs7z\" (UniqueName: \"kubernetes.io/projected/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-kube-api-access-kbs7z\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nnz9f\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.371349 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.892563 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f"] Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.901268 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 15:36:17 crc kubenswrapper[4775]: I1216 15:36:17.943841 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" event={"ID":"e3ac9c58-f9b2-4b76-baec-dc50c94c8185","Type":"ContainerStarted","Data":"1cbc2a870e0b20ad669f94a550cbbf3c4120c72cca7d16438574f73d22f525b4"} Dec 16 15:36:18 crc kubenswrapper[4775]: I1216 15:36:18.958585 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" event={"ID":"e3ac9c58-f9b2-4b76-baec-dc50c94c8185","Type":"ContainerStarted","Data":"2b9882400ea58347a9db4b462086e7aa6cff69f4dd93cdf3590d9a56ded53d17"} Dec 16 15:36:18 crc kubenswrapper[4775]: I1216 15:36:18.993190 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" podStartSLOduration=1.57236837 podStartE2EDuration="1.993170915s" podCreationTimestamp="2025-12-16 15:36:17 +0000 UTC" firstStartedPulling="2025-12-16 15:36:17.901042948 +0000 UTC m=+2502.852121871" lastFinishedPulling="2025-12-16 15:36:18.321845493 +0000 UTC m=+2503.272924416" observedRunningTime="2025-12-16 15:36:18.978589325 +0000 UTC m=+2503.929668278" watchObservedRunningTime="2025-12-16 15:36:18.993170915 +0000 UTC m=+2503.944249838" Dec 16 15:36:26 crc kubenswrapper[4775]: I1216 15:36:26.338314 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:36:26 crc kubenswrapper[4775]: E1216 15:36:26.338992 4775 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:36:40 crc kubenswrapper[4775]: I1216 15:36:40.337728 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:36:40 crc kubenswrapper[4775]: E1216 15:36:40.338460 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:36:54 crc kubenswrapper[4775]: I1216 15:36:54.337958 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:36:54 crc kubenswrapper[4775]: E1216 15:36:54.339260 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:37:09 crc kubenswrapper[4775]: I1216 15:37:09.338411 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:37:09 crc kubenswrapper[4775]: E1216 
15:37:09.340404 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:37:24 crc kubenswrapper[4775]: I1216 15:37:24.338251 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:37:24 crc kubenswrapper[4775]: E1216 15:37:24.338897 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:37:39 crc kubenswrapper[4775]: I1216 15:37:39.340259 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:37:39 crc kubenswrapper[4775]: E1216 15:37:39.341135 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:37:53 crc kubenswrapper[4775]: I1216 15:37:53.337734 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:37:53 crc 
kubenswrapper[4775]: E1216 15:37:53.338548 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:38:04 crc kubenswrapper[4775]: I1216 15:38:04.337803 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:38:04 crc kubenswrapper[4775]: E1216 15:38:04.338552 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:38:18 crc kubenswrapper[4775]: I1216 15:38:18.345648 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:38:18 crc kubenswrapper[4775]: E1216 15:38:18.346553 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:38:33 crc kubenswrapper[4775]: I1216 15:38:33.339394 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 
16 15:38:33 crc kubenswrapper[4775]: E1216 15:38:33.340154 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:38:48 crc kubenswrapper[4775]: I1216 15:38:48.338439 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:38:48 crc kubenswrapper[4775]: E1216 15:38:48.339156 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:38:55 crc kubenswrapper[4775]: I1216 15:38:55.693173 4775 generic.go:334] "Generic (PLEG): container finished" podID="e3ac9c58-f9b2-4b76-baec-dc50c94c8185" containerID="2b9882400ea58347a9db4b462086e7aa6cff69f4dd93cdf3590d9a56ded53d17" exitCode=0 Dec 16 15:38:55 crc kubenswrapper[4775]: I1216 15:38:55.693253 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" event={"ID":"e3ac9c58-f9b2-4b76-baec-dc50c94c8185","Type":"ContainerDied","Data":"2b9882400ea58347a9db4b462086e7aa6cff69f4dd93cdf3590d9a56ded53d17"} Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.087779 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.241387 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbs7z\" (UniqueName: \"kubernetes.io/projected/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-kube-api-access-kbs7z\") pod \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.241492 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-cell1-compute-config-1\") pod \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.241542 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-migration-ssh-key-0\") pod \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.241573 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-migration-ssh-key-1\") pod \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.241611 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-combined-ca-bundle\") pod \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 
15:38:57.241638 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-ssh-key\") pod \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.241852 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-extra-config-0\") pod \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.241919 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-inventory\") pod \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.241956 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-cell1-compute-config-0\") pod \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\" (UID: \"e3ac9c58-f9b2-4b76-baec-dc50c94c8185\") " Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.248578 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-kube-api-access-kbs7z" (OuterVolumeSpecName: "kube-api-access-kbs7z") pod "e3ac9c58-f9b2-4b76-baec-dc50c94c8185" (UID: "e3ac9c58-f9b2-4b76-baec-dc50c94c8185"). InnerVolumeSpecName "kube-api-access-kbs7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.249352 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e3ac9c58-f9b2-4b76-baec-dc50c94c8185" (UID: "e3ac9c58-f9b2-4b76-baec-dc50c94c8185"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.273384 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "e3ac9c58-f9b2-4b76-baec-dc50c94c8185" (UID: "e3ac9c58-f9b2-4b76-baec-dc50c94c8185"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.274700 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-inventory" (OuterVolumeSpecName: "inventory") pod "e3ac9c58-f9b2-4b76-baec-dc50c94c8185" (UID: "e3ac9c58-f9b2-4b76-baec-dc50c94c8185"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.275640 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "e3ac9c58-f9b2-4b76-baec-dc50c94c8185" (UID: "e3ac9c58-f9b2-4b76-baec-dc50c94c8185"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.275845 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "e3ac9c58-f9b2-4b76-baec-dc50c94c8185" (UID: "e3ac9c58-f9b2-4b76-baec-dc50c94c8185"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.284958 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e3ac9c58-f9b2-4b76-baec-dc50c94c8185" (UID: "e3ac9c58-f9b2-4b76-baec-dc50c94c8185"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.287574 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "e3ac9c58-f9b2-4b76-baec-dc50c94c8185" (UID: "e3ac9c58-f9b2-4b76-baec-dc50c94c8185"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.289181 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "e3ac9c58-f9b2-4b76-baec-dc50c94c8185" (UID: "e3ac9c58-f9b2-4b76-baec-dc50c94c8185"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.343942 4775 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.344253 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbs7z\" (UniqueName: \"kubernetes.io/projected/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-kube-api-access-kbs7z\") on node \"crc\" DevicePath \"\"" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.344313 4775 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.344366 4775 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.344452 4775 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.344516 4775 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.344606 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-ssh-key\") on node 
\"crc\" DevicePath \"\"" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.344688 4775 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.344763 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ac9c58-f9b2-4b76-baec-dc50c94c8185-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.712738 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" event={"ID":"e3ac9c58-f9b2-4b76-baec-dc50c94c8185","Type":"ContainerDied","Data":"1cbc2a870e0b20ad669f94a550cbbf3c4120c72cca7d16438574f73d22f525b4"} Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.712787 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cbc2a870e0b20ad669f94a550cbbf3c4120c72cca7d16438574f73d22f525b4" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.712857 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nnz9f" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.830099 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929"] Dec 16 15:38:57 crc kubenswrapper[4775]: E1216 15:38:57.830838 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ac9c58-f9b2-4b76-baec-dc50c94c8185" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.830860 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ac9c58-f9b2-4b76-baec-dc50c94c8185" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.831124 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ac9c58-f9b2-4b76-baec-dc50c94c8185" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.831769 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.835180 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.835507 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.835686 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.835799 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.838304 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tgv5f" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.844264 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929"] Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.955276 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.955321 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.955381 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpgnr\" (UniqueName: \"kubernetes.io/projected/533cc620-42ce-4262-bcfe-25c8ebe74ff6-kube-api-access-jpgnr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.955399 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.955431 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.955449 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" 
Dec 16 15:38:57 crc kubenswrapper[4775]: I1216 15:38:57.955475 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:58 crc kubenswrapper[4775]: I1216 15:38:58.057283 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:58 crc kubenswrapper[4775]: I1216 15:38:58.057327 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:58 crc kubenswrapper[4775]: I1216 15:38:58.057394 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpgnr\" (UniqueName: \"kubernetes.io/projected/533cc620-42ce-4262-bcfe-25c8ebe74ff6-kube-api-access-jpgnr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:58 crc kubenswrapper[4775]: I1216 15:38:58.057419 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:58 crc kubenswrapper[4775]: I1216 15:38:58.057456 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:58 crc kubenswrapper[4775]: I1216 15:38:58.057478 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:58 crc kubenswrapper[4775]: I1216 15:38:58.057514 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:58 crc kubenswrapper[4775]: I1216 15:38:58.061617 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: 
\"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:58 crc kubenswrapper[4775]: I1216 15:38:58.062117 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:58 crc kubenswrapper[4775]: I1216 15:38:58.062487 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:58 crc kubenswrapper[4775]: I1216 15:38:58.062876 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:58 crc kubenswrapper[4775]: I1216 15:38:58.063045 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:58 crc kubenswrapper[4775]: I1216 15:38:58.064543 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:58 crc kubenswrapper[4775]: I1216 15:38:58.076661 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpgnr\" (UniqueName: \"kubernetes.io/projected/533cc620-42ce-4262-bcfe-25c8ebe74ff6-kube-api-access-jpgnr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9n929\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:58 crc kubenswrapper[4775]: I1216 15:38:58.151582 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:38:58 crc kubenswrapper[4775]: I1216 15:38:58.694960 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929"] Dec 16 15:38:58 crc kubenswrapper[4775]: I1216 15:38:58.723876 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" event={"ID":"533cc620-42ce-4262-bcfe-25c8ebe74ff6","Type":"ContainerStarted","Data":"7c70fb48387bcc31b16d66356f7ca3d403a9d9822fa9b0614c4cf2f2e9074da7"} Dec 16 15:38:59 crc kubenswrapper[4775]: I1216 15:38:59.338293 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:38:59 crc kubenswrapper[4775]: E1216 15:38:59.338684 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:39:00 crc kubenswrapper[4775]: I1216 15:39:00.740975 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" event={"ID":"533cc620-42ce-4262-bcfe-25c8ebe74ff6","Type":"ContainerStarted","Data":"c59ca338db88c7f31220e819463c2205e4774f0aff112d61ca23d8705170035f"} Dec 16 15:39:00 crc kubenswrapper[4775]: I1216 15:39:00.761963 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" podStartSLOduration=2.924757037 podStartE2EDuration="3.761947056s" podCreationTimestamp="2025-12-16 15:38:57 +0000 UTC" firstStartedPulling="2025-12-16 15:38:58.700853865 +0000 UTC m=+2663.651932788" lastFinishedPulling="2025-12-16 15:38:59.538043874 +0000 UTC m=+2664.489122807" observedRunningTime="2025-12-16 15:39:00.759194545 +0000 UTC m=+2665.710273468" watchObservedRunningTime="2025-12-16 15:39:00.761947056 +0000 UTC m=+2665.713025979" Dec 16 15:39:12 crc kubenswrapper[4775]: I1216 15:39:12.381150 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:39:12 crc kubenswrapper[4775]: E1216 15:39:12.381832 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:39:15 crc kubenswrapper[4775]: I1216 15:39:15.146264 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-sftrm"] Dec 16 15:39:15 crc kubenswrapper[4775]: I1216 15:39:15.149456 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sftrm" Dec 16 15:39:15 crc kubenswrapper[4775]: I1216 15:39:15.159155 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sftrm"] Dec 16 15:39:15 crc kubenswrapper[4775]: I1216 15:39:15.258578 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75473199-f04b-4306-80d2-d7201eac3739-catalog-content\") pod \"redhat-operators-sftrm\" (UID: \"75473199-f04b-4306-80d2-d7201eac3739\") " pod="openshift-marketplace/redhat-operators-sftrm" Dec 16 15:39:15 crc kubenswrapper[4775]: I1216 15:39:15.258744 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75473199-f04b-4306-80d2-d7201eac3739-utilities\") pod \"redhat-operators-sftrm\" (UID: \"75473199-f04b-4306-80d2-d7201eac3739\") " pod="openshift-marketplace/redhat-operators-sftrm" Dec 16 15:39:15 crc kubenswrapper[4775]: I1216 15:39:15.258786 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7b9g\" (UniqueName: \"kubernetes.io/projected/75473199-f04b-4306-80d2-d7201eac3739-kube-api-access-r7b9g\") pod \"redhat-operators-sftrm\" (UID: \"75473199-f04b-4306-80d2-d7201eac3739\") " pod="openshift-marketplace/redhat-operators-sftrm" Dec 16 15:39:15 crc kubenswrapper[4775]: I1216 15:39:15.362522 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75473199-f04b-4306-80d2-d7201eac3739-catalog-content\") pod \"redhat-operators-sftrm\" (UID: \"75473199-f04b-4306-80d2-d7201eac3739\") " 
pod="openshift-marketplace/redhat-operators-sftrm" Dec 16 15:39:15 crc kubenswrapper[4775]: I1216 15:39:15.362977 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75473199-f04b-4306-80d2-d7201eac3739-catalog-content\") pod \"redhat-operators-sftrm\" (UID: \"75473199-f04b-4306-80d2-d7201eac3739\") " pod="openshift-marketplace/redhat-operators-sftrm" Dec 16 15:39:15 crc kubenswrapper[4775]: I1216 15:39:15.363524 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75473199-f04b-4306-80d2-d7201eac3739-utilities\") pod \"redhat-operators-sftrm\" (UID: \"75473199-f04b-4306-80d2-d7201eac3739\") " pod="openshift-marketplace/redhat-operators-sftrm" Dec 16 15:39:15 crc kubenswrapper[4775]: I1216 15:39:15.363561 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7b9g\" (UniqueName: \"kubernetes.io/projected/75473199-f04b-4306-80d2-d7201eac3739-kube-api-access-r7b9g\") pod \"redhat-operators-sftrm\" (UID: \"75473199-f04b-4306-80d2-d7201eac3739\") " pod="openshift-marketplace/redhat-operators-sftrm" Dec 16 15:39:15 crc kubenswrapper[4775]: I1216 15:39:15.364099 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75473199-f04b-4306-80d2-d7201eac3739-utilities\") pod \"redhat-operators-sftrm\" (UID: \"75473199-f04b-4306-80d2-d7201eac3739\") " pod="openshift-marketplace/redhat-operators-sftrm" Dec 16 15:39:15 crc kubenswrapper[4775]: I1216 15:39:15.397108 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7b9g\" (UniqueName: \"kubernetes.io/projected/75473199-f04b-4306-80d2-d7201eac3739-kube-api-access-r7b9g\") pod \"redhat-operators-sftrm\" (UID: \"75473199-f04b-4306-80d2-d7201eac3739\") " pod="openshift-marketplace/redhat-operators-sftrm" Dec 16 15:39:15 
crc kubenswrapper[4775]: I1216 15:39:15.482617 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sftrm" Dec 16 15:39:16 crc kubenswrapper[4775]: I1216 15:39:16.398762 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sftrm"] Dec 16 15:39:17 crc kubenswrapper[4775]: I1216 15:39:17.370592 4775 generic.go:334] "Generic (PLEG): container finished" podID="75473199-f04b-4306-80d2-d7201eac3739" containerID="c661452589535985d3fd3a4b0a1262e67cb19e4a0f8f23cc958cd41537ee2607" exitCode=0 Dec 16 15:39:17 crc kubenswrapper[4775]: I1216 15:39:17.370715 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sftrm" event={"ID":"75473199-f04b-4306-80d2-d7201eac3739","Type":"ContainerDied","Data":"c661452589535985d3fd3a4b0a1262e67cb19e4a0f8f23cc958cd41537ee2607"} Dec 16 15:39:17 crc kubenswrapper[4775]: I1216 15:39:17.370967 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sftrm" event={"ID":"75473199-f04b-4306-80d2-d7201eac3739","Type":"ContainerStarted","Data":"b97ea4d71c2e77fcded5907f0755d4d2de28055c79ee03122cafa88e9b1845cd"} Dec 16 15:39:19 crc kubenswrapper[4775]: I1216 15:39:19.393742 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sftrm" event={"ID":"75473199-f04b-4306-80d2-d7201eac3739","Type":"ContainerStarted","Data":"c437c922d4150a52129c0946df9ad5b9a1536b15d04324f70910ce73f0cb769e"} Dec 16 15:39:22 crc kubenswrapper[4775]: I1216 15:39:22.825948 4775 generic.go:334] "Generic (PLEG): container finished" podID="75473199-f04b-4306-80d2-d7201eac3739" containerID="c437c922d4150a52129c0946df9ad5b9a1536b15d04324f70910ce73f0cb769e" exitCode=0 Dec 16 15:39:22 crc kubenswrapper[4775]: I1216 15:39:22.825996 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sftrm" 
event={"ID":"75473199-f04b-4306-80d2-d7201eac3739","Type":"ContainerDied","Data":"c437c922d4150a52129c0946df9ad5b9a1536b15d04324f70910ce73f0cb769e"} Dec 16 15:39:23 crc kubenswrapper[4775]: I1216 15:39:23.338519 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:39:23 crc kubenswrapper[4775]: E1216 15:39:23.339128 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:39:25 crc kubenswrapper[4775]: I1216 15:39:25.917109 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sftrm" event={"ID":"75473199-f04b-4306-80d2-d7201eac3739","Type":"ContainerStarted","Data":"d80a5117cca9e5243359696e89334c45edda1559c81efbfe5c261c46cb0ef4aa"} Dec 16 15:39:25 crc kubenswrapper[4775]: I1216 15:39:25.934227 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sftrm" podStartSLOduration=3.323779805 podStartE2EDuration="10.934208085s" podCreationTimestamp="2025-12-16 15:39:15 +0000 UTC" firstStartedPulling="2025-12-16 15:39:17.373384781 +0000 UTC m=+2682.324463694" lastFinishedPulling="2025-12-16 15:39:24.983813051 +0000 UTC m=+2689.934891974" observedRunningTime="2025-12-16 15:39:25.93235024 +0000 UTC m=+2690.883429163" watchObservedRunningTime="2025-12-16 15:39:25.934208085 +0000 UTC m=+2690.885287008" Dec 16 15:39:35 crc kubenswrapper[4775]: I1216 15:39:35.484836 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sftrm" Dec 16 15:39:35 crc 
kubenswrapper[4775]: I1216 15:39:35.485432 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sftrm" Dec 16 15:39:35 crc kubenswrapper[4775]: I1216 15:39:35.529500 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sftrm" Dec 16 15:39:36 crc kubenswrapper[4775]: I1216 15:39:36.058069 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sftrm" Dec 16 15:39:36 crc kubenswrapper[4775]: I1216 15:39:36.102379 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sftrm"] Dec 16 15:39:37 crc kubenswrapper[4775]: I1216 15:39:37.337730 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:39:38 crc kubenswrapper[4775]: I1216 15:39:38.027689 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"4729b3904585254169a82d5de79bd940893d6ca623bcb8a4ed43c5b86e8831f4"} Dec 16 15:39:38 crc kubenswrapper[4775]: I1216 15:39:38.027805 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sftrm" podUID="75473199-f04b-4306-80d2-d7201eac3739" containerName="registry-server" containerID="cri-o://d80a5117cca9e5243359696e89334c45edda1559c81efbfe5c261c46cb0ef4aa" gracePeriod=2 Dec 16 15:39:38 crc kubenswrapper[4775]: I1216 15:39:38.459647 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sftrm" Dec 16 15:39:38 crc kubenswrapper[4775]: I1216 15:39:38.476544 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75473199-f04b-4306-80d2-d7201eac3739-utilities\") pod \"75473199-f04b-4306-80d2-d7201eac3739\" (UID: \"75473199-f04b-4306-80d2-d7201eac3739\") " Dec 16 15:39:38 crc kubenswrapper[4775]: I1216 15:39:38.476627 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75473199-f04b-4306-80d2-d7201eac3739-catalog-content\") pod \"75473199-f04b-4306-80d2-d7201eac3739\" (UID: \"75473199-f04b-4306-80d2-d7201eac3739\") " Dec 16 15:39:38 crc kubenswrapper[4775]: I1216 15:39:38.476765 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7b9g\" (UniqueName: \"kubernetes.io/projected/75473199-f04b-4306-80d2-d7201eac3739-kube-api-access-r7b9g\") pod \"75473199-f04b-4306-80d2-d7201eac3739\" (UID: \"75473199-f04b-4306-80d2-d7201eac3739\") " Dec 16 15:39:38 crc kubenswrapper[4775]: I1216 15:39:38.478588 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75473199-f04b-4306-80d2-d7201eac3739-utilities" (OuterVolumeSpecName: "utilities") pod "75473199-f04b-4306-80d2-d7201eac3739" (UID: "75473199-f04b-4306-80d2-d7201eac3739"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:39:38 crc kubenswrapper[4775]: I1216 15:39:38.486997 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75473199-f04b-4306-80d2-d7201eac3739-kube-api-access-r7b9g" (OuterVolumeSpecName: "kube-api-access-r7b9g") pod "75473199-f04b-4306-80d2-d7201eac3739" (UID: "75473199-f04b-4306-80d2-d7201eac3739"). InnerVolumeSpecName "kube-api-access-r7b9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:39:38 crc kubenswrapper[4775]: I1216 15:39:38.578579 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75473199-f04b-4306-80d2-d7201eac3739-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:39:38 crc kubenswrapper[4775]: I1216 15:39:38.578619 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7b9g\" (UniqueName: \"kubernetes.io/projected/75473199-f04b-4306-80d2-d7201eac3739-kube-api-access-r7b9g\") on node \"crc\" DevicePath \"\"" Dec 16 15:39:38 crc kubenswrapper[4775]: I1216 15:39:38.594703 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75473199-f04b-4306-80d2-d7201eac3739-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75473199-f04b-4306-80d2-d7201eac3739" (UID: "75473199-f04b-4306-80d2-d7201eac3739"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:39:38 crc kubenswrapper[4775]: I1216 15:39:38.680058 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75473199-f04b-4306-80d2-d7201eac3739-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:39:39 crc kubenswrapper[4775]: I1216 15:39:39.037420 4775 generic.go:334] "Generic (PLEG): container finished" podID="75473199-f04b-4306-80d2-d7201eac3739" containerID="d80a5117cca9e5243359696e89334c45edda1559c81efbfe5c261c46cb0ef4aa" exitCode=0 Dec 16 15:39:39 crc kubenswrapper[4775]: I1216 15:39:39.037462 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sftrm" event={"ID":"75473199-f04b-4306-80d2-d7201eac3739","Type":"ContainerDied","Data":"d80a5117cca9e5243359696e89334c45edda1559c81efbfe5c261c46cb0ef4aa"} Dec 16 15:39:39 crc kubenswrapper[4775]: I1216 15:39:39.037843 4775 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-sftrm" event={"ID":"75473199-f04b-4306-80d2-d7201eac3739","Type":"ContainerDied","Data":"b97ea4d71c2e77fcded5907f0755d4d2de28055c79ee03122cafa88e9b1845cd"} Dec 16 15:39:39 crc kubenswrapper[4775]: I1216 15:39:39.037867 4775 scope.go:117] "RemoveContainer" containerID="d80a5117cca9e5243359696e89334c45edda1559c81efbfe5c261c46cb0ef4aa" Dec 16 15:39:39 crc kubenswrapper[4775]: I1216 15:39:39.037533 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sftrm" Dec 16 15:39:39 crc kubenswrapper[4775]: I1216 15:39:39.060617 4775 scope.go:117] "RemoveContainer" containerID="c437c922d4150a52129c0946df9ad5b9a1536b15d04324f70910ce73f0cb769e" Dec 16 15:39:39 crc kubenswrapper[4775]: I1216 15:39:39.072782 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sftrm"] Dec 16 15:39:39 crc kubenswrapper[4775]: I1216 15:39:39.080210 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sftrm"] Dec 16 15:39:39 crc kubenswrapper[4775]: I1216 15:39:39.100909 4775 scope.go:117] "RemoveContainer" containerID="c661452589535985d3fd3a4b0a1262e67cb19e4a0f8f23cc958cd41537ee2607" Dec 16 15:39:39 crc kubenswrapper[4775]: I1216 15:39:39.142288 4775 scope.go:117] "RemoveContainer" containerID="d80a5117cca9e5243359696e89334c45edda1559c81efbfe5c261c46cb0ef4aa" Dec 16 15:39:39 crc kubenswrapper[4775]: E1216 15:39:39.142730 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d80a5117cca9e5243359696e89334c45edda1559c81efbfe5c261c46cb0ef4aa\": container with ID starting with d80a5117cca9e5243359696e89334c45edda1559c81efbfe5c261c46cb0ef4aa not found: ID does not exist" containerID="d80a5117cca9e5243359696e89334c45edda1559c81efbfe5c261c46cb0ef4aa" Dec 16 15:39:39 crc kubenswrapper[4775]: I1216 15:39:39.142774 4775 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d80a5117cca9e5243359696e89334c45edda1559c81efbfe5c261c46cb0ef4aa"} err="failed to get container status \"d80a5117cca9e5243359696e89334c45edda1559c81efbfe5c261c46cb0ef4aa\": rpc error: code = NotFound desc = could not find container \"d80a5117cca9e5243359696e89334c45edda1559c81efbfe5c261c46cb0ef4aa\": container with ID starting with d80a5117cca9e5243359696e89334c45edda1559c81efbfe5c261c46cb0ef4aa not found: ID does not exist" Dec 16 15:39:39 crc kubenswrapper[4775]: I1216 15:39:39.142821 4775 scope.go:117] "RemoveContainer" containerID="c437c922d4150a52129c0946df9ad5b9a1536b15d04324f70910ce73f0cb769e" Dec 16 15:39:39 crc kubenswrapper[4775]: E1216 15:39:39.144043 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c437c922d4150a52129c0946df9ad5b9a1536b15d04324f70910ce73f0cb769e\": container with ID starting with c437c922d4150a52129c0946df9ad5b9a1536b15d04324f70910ce73f0cb769e not found: ID does not exist" containerID="c437c922d4150a52129c0946df9ad5b9a1536b15d04324f70910ce73f0cb769e" Dec 16 15:39:39 crc kubenswrapper[4775]: I1216 15:39:39.144090 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c437c922d4150a52129c0946df9ad5b9a1536b15d04324f70910ce73f0cb769e"} err="failed to get container status \"c437c922d4150a52129c0946df9ad5b9a1536b15d04324f70910ce73f0cb769e\": rpc error: code = NotFound desc = could not find container \"c437c922d4150a52129c0946df9ad5b9a1536b15d04324f70910ce73f0cb769e\": container with ID starting with c437c922d4150a52129c0946df9ad5b9a1536b15d04324f70910ce73f0cb769e not found: ID does not exist" Dec 16 15:39:39 crc kubenswrapper[4775]: I1216 15:39:39.144116 4775 scope.go:117] "RemoveContainer" containerID="c661452589535985d3fd3a4b0a1262e67cb19e4a0f8f23cc958cd41537ee2607" Dec 16 15:39:39 crc kubenswrapper[4775]: E1216 
15:39:39.144435 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c661452589535985d3fd3a4b0a1262e67cb19e4a0f8f23cc958cd41537ee2607\": container with ID starting with c661452589535985d3fd3a4b0a1262e67cb19e4a0f8f23cc958cd41537ee2607 not found: ID does not exist" containerID="c661452589535985d3fd3a4b0a1262e67cb19e4a0f8f23cc958cd41537ee2607" Dec 16 15:39:39 crc kubenswrapper[4775]: I1216 15:39:39.144471 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c661452589535985d3fd3a4b0a1262e67cb19e4a0f8f23cc958cd41537ee2607"} err="failed to get container status \"c661452589535985d3fd3a4b0a1262e67cb19e4a0f8f23cc958cd41537ee2607\": rpc error: code = NotFound desc = could not find container \"c661452589535985d3fd3a4b0a1262e67cb19e4a0f8f23cc958cd41537ee2607\": container with ID starting with c661452589535985d3fd3a4b0a1262e67cb19e4a0f8f23cc958cd41537ee2607 not found: ID does not exist" Dec 16 15:39:39 crc kubenswrapper[4775]: I1216 15:39:39.349793 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75473199-f04b-4306-80d2-d7201eac3739" path="/var/lib/kubelet/pods/75473199-f04b-4306-80d2-d7201eac3739/volumes" Dec 16 15:40:28 crc kubenswrapper[4775]: I1216 15:40:28.846006 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jmhtx"] Dec 16 15:40:28 crc kubenswrapper[4775]: E1216 15:40:28.847241 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75473199-f04b-4306-80d2-d7201eac3739" containerName="extract-utilities" Dec 16 15:40:28 crc kubenswrapper[4775]: I1216 15:40:28.847266 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="75473199-f04b-4306-80d2-d7201eac3739" containerName="extract-utilities" Dec 16 15:40:28 crc kubenswrapper[4775]: E1216 15:40:28.847305 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="75473199-f04b-4306-80d2-d7201eac3739" containerName="extract-content" Dec 16 15:40:28 crc kubenswrapper[4775]: I1216 15:40:28.847317 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="75473199-f04b-4306-80d2-d7201eac3739" containerName="extract-content" Dec 16 15:40:28 crc kubenswrapper[4775]: E1216 15:40:28.847342 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75473199-f04b-4306-80d2-d7201eac3739" containerName="registry-server" Dec 16 15:40:28 crc kubenswrapper[4775]: I1216 15:40:28.847354 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="75473199-f04b-4306-80d2-d7201eac3739" containerName="registry-server" Dec 16 15:40:28 crc kubenswrapper[4775]: I1216 15:40:28.847713 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="75473199-f04b-4306-80d2-d7201eac3739" containerName="registry-server" Dec 16 15:40:28 crc kubenswrapper[4775]: I1216 15:40:28.849879 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jmhtx" Dec 16 15:40:28 crc kubenswrapper[4775]: I1216 15:40:28.873025 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jmhtx"] Dec 16 15:40:28 crc kubenswrapper[4775]: I1216 15:40:28.978258 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xfk6\" (UniqueName: \"kubernetes.io/projected/c0751bbc-1693-4de3-a2c7-d522fee65730-kube-api-access-9xfk6\") pod \"certified-operators-jmhtx\" (UID: \"c0751bbc-1693-4de3-a2c7-d522fee65730\") " pod="openshift-marketplace/certified-operators-jmhtx" Dec 16 15:40:28 crc kubenswrapper[4775]: I1216 15:40:28.978591 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0751bbc-1693-4de3-a2c7-d522fee65730-catalog-content\") pod \"certified-operators-jmhtx\" (UID: 
\"c0751bbc-1693-4de3-a2c7-d522fee65730\") " pod="openshift-marketplace/certified-operators-jmhtx" Dec 16 15:40:28 crc kubenswrapper[4775]: I1216 15:40:28.978681 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0751bbc-1693-4de3-a2c7-d522fee65730-utilities\") pod \"certified-operators-jmhtx\" (UID: \"c0751bbc-1693-4de3-a2c7-d522fee65730\") " pod="openshift-marketplace/certified-operators-jmhtx" Dec 16 15:40:29 crc kubenswrapper[4775]: I1216 15:40:29.080430 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0751bbc-1693-4de3-a2c7-d522fee65730-utilities\") pod \"certified-operators-jmhtx\" (UID: \"c0751bbc-1693-4de3-a2c7-d522fee65730\") " pod="openshift-marketplace/certified-operators-jmhtx" Dec 16 15:40:29 crc kubenswrapper[4775]: I1216 15:40:29.080601 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xfk6\" (UniqueName: \"kubernetes.io/projected/c0751bbc-1693-4de3-a2c7-d522fee65730-kube-api-access-9xfk6\") pod \"certified-operators-jmhtx\" (UID: \"c0751bbc-1693-4de3-a2c7-d522fee65730\") " pod="openshift-marketplace/certified-operators-jmhtx" Dec 16 15:40:29 crc kubenswrapper[4775]: I1216 15:40:29.080629 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0751bbc-1693-4de3-a2c7-d522fee65730-catalog-content\") pod \"certified-operators-jmhtx\" (UID: \"c0751bbc-1693-4de3-a2c7-d522fee65730\") " pod="openshift-marketplace/certified-operators-jmhtx" Dec 16 15:40:29 crc kubenswrapper[4775]: I1216 15:40:29.081219 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0751bbc-1693-4de3-a2c7-d522fee65730-catalog-content\") pod \"certified-operators-jmhtx\" (UID: 
\"c0751bbc-1693-4de3-a2c7-d522fee65730\") " pod="openshift-marketplace/certified-operators-jmhtx" Dec 16 15:40:29 crc kubenswrapper[4775]: I1216 15:40:29.081457 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0751bbc-1693-4de3-a2c7-d522fee65730-utilities\") pod \"certified-operators-jmhtx\" (UID: \"c0751bbc-1693-4de3-a2c7-d522fee65730\") " pod="openshift-marketplace/certified-operators-jmhtx" Dec 16 15:40:29 crc kubenswrapper[4775]: I1216 15:40:29.115255 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xfk6\" (UniqueName: \"kubernetes.io/projected/c0751bbc-1693-4de3-a2c7-d522fee65730-kube-api-access-9xfk6\") pod \"certified-operators-jmhtx\" (UID: \"c0751bbc-1693-4de3-a2c7-d522fee65730\") " pod="openshift-marketplace/certified-operators-jmhtx" Dec 16 15:40:29 crc kubenswrapper[4775]: I1216 15:40:29.170685 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jmhtx" Dec 16 15:40:29 crc kubenswrapper[4775]: I1216 15:40:29.812954 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jmhtx"] Dec 16 15:40:30 crc kubenswrapper[4775]: I1216 15:40:30.497903 4775 generic.go:334] "Generic (PLEG): container finished" podID="c0751bbc-1693-4de3-a2c7-d522fee65730" containerID="314914c32f5657c2a0849ddfd05b10edd26775a5c0afd76684735b59a8a47615" exitCode=0 Dec 16 15:40:30 crc kubenswrapper[4775]: I1216 15:40:30.497981 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jmhtx" event={"ID":"c0751bbc-1693-4de3-a2c7-d522fee65730","Type":"ContainerDied","Data":"314914c32f5657c2a0849ddfd05b10edd26775a5c0afd76684735b59a8a47615"} Dec 16 15:40:30 crc kubenswrapper[4775]: I1216 15:40:30.500191 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jmhtx" 
event={"ID":"c0751bbc-1693-4de3-a2c7-d522fee65730","Type":"ContainerStarted","Data":"3933fb0a84f612b61ed1a50dddcdd39c773cb2bd6a4453fe27bc458f2c06b60e"} Dec 16 15:40:31 crc kubenswrapper[4775]: I1216 15:40:31.042185 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8969g"] Dec 16 15:40:31 crc kubenswrapper[4775]: I1216 15:40:31.044460 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8969g" Dec 16 15:40:31 crc kubenswrapper[4775]: I1216 15:40:31.070143 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8969g"] Dec 16 15:40:31 crc kubenswrapper[4775]: I1216 15:40:31.127974 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ae4b06-fd52-41a4-9111-bfbb4347444f-catalog-content\") pod \"redhat-marketplace-8969g\" (UID: \"a8ae4b06-fd52-41a4-9111-bfbb4347444f\") " pod="openshift-marketplace/redhat-marketplace-8969g" Dec 16 15:40:31 crc kubenswrapper[4775]: I1216 15:40:31.128031 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ae4b06-fd52-41a4-9111-bfbb4347444f-utilities\") pod \"redhat-marketplace-8969g\" (UID: \"a8ae4b06-fd52-41a4-9111-bfbb4347444f\") " pod="openshift-marketplace/redhat-marketplace-8969g" Dec 16 15:40:31 crc kubenswrapper[4775]: I1216 15:40:31.128088 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj24b\" (UniqueName: \"kubernetes.io/projected/a8ae4b06-fd52-41a4-9111-bfbb4347444f-kube-api-access-pj24b\") pod \"redhat-marketplace-8969g\" (UID: \"a8ae4b06-fd52-41a4-9111-bfbb4347444f\") " pod="openshift-marketplace/redhat-marketplace-8969g" Dec 16 15:40:31 crc kubenswrapper[4775]: I1216 15:40:31.230478 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ae4b06-fd52-41a4-9111-bfbb4347444f-catalog-content\") pod \"redhat-marketplace-8969g\" (UID: \"a8ae4b06-fd52-41a4-9111-bfbb4347444f\") " pod="openshift-marketplace/redhat-marketplace-8969g" Dec 16 15:40:31 crc kubenswrapper[4775]: I1216 15:40:31.230797 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ae4b06-fd52-41a4-9111-bfbb4347444f-utilities\") pod \"redhat-marketplace-8969g\" (UID: \"a8ae4b06-fd52-41a4-9111-bfbb4347444f\") " pod="openshift-marketplace/redhat-marketplace-8969g" Dec 16 15:40:31 crc kubenswrapper[4775]: I1216 15:40:31.231520 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj24b\" (UniqueName: \"kubernetes.io/projected/a8ae4b06-fd52-41a4-9111-bfbb4347444f-kube-api-access-pj24b\") pod \"redhat-marketplace-8969g\" (UID: \"a8ae4b06-fd52-41a4-9111-bfbb4347444f\") " pod="openshift-marketplace/redhat-marketplace-8969g" Dec 16 15:40:31 crc kubenswrapper[4775]: I1216 15:40:31.231108 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ae4b06-fd52-41a4-9111-bfbb4347444f-catalog-content\") pod \"redhat-marketplace-8969g\" (UID: \"a8ae4b06-fd52-41a4-9111-bfbb4347444f\") " pod="openshift-marketplace/redhat-marketplace-8969g" Dec 16 15:40:31 crc kubenswrapper[4775]: I1216 15:40:31.231400 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ae4b06-fd52-41a4-9111-bfbb4347444f-utilities\") pod \"redhat-marketplace-8969g\" (UID: \"a8ae4b06-fd52-41a4-9111-bfbb4347444f\") " pod="openshift-marketplace/redhat-marketplace-8969g" Dec 16 15:40:31 crc kubenswrapper[4775]: I1216 15:40:31.254602 4775 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-pj24b\" (UniqueName: \"kubernetes.io/projected/a8ae4b06-fd52-41a4-9111-bfbb4347444f-kube-api-access-pj24b\") pod \"redhat-marketplace-8969g\" (UID: \"a8ae4b06-fd52-41a4-9111-bfbb4347444f\") " pod="openshift-marketplace/redhat-marketplace-8969g" Dec 16 15:40:31 crc kubenswrapper[4775]: I1216 15:40:31.370323 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8969g" Dec 16 15:40:31 crc kubenswrapper[4775]: I1216 15:40:31.587505 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jmhtx" event={"ID":"c0751bbc-1693-4de3-a2c7-d522fee65730","Type":"ContainerStarted","Data":"9c3575f9de27cff02bf4329ade60be87846e8f63fb5d19dbf82015c209549902"} Dec 16 15:40:31 crc kubenswrapper[4775]: I1216 15:40:31.890374 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8969g"] Dec 16 15:40:32 crc kubenswrapper[4775]: I1216 15:40:32.600420 4775 generic.go:334] "Generic (PLEG): container finished" podID="c0751bbc-1693-4de3-a2c7-d522fee65730" containerID="9c3575f9de27cff02bf4329ade60be87846e8f63fb5d19dbf82015c209549902" exitCode=0 Dec 16 15:40:32 crc kubenswrapper[4775]: I1216 15:40:32.600540 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jmhtx" event={"ID":"c0751bbc-1693-4de3-a2c7-d522fee65730","Type":"ContainerDied","Data":"9c3575f9de27cff02bf4329ade60be87846e8f63fb5d19dbf82015c209549902"} Dec 16 15:40:32 crc kubenswrapper[4775]: I1216 15:40:32.602774 4775 generic.go:334] "Generic (PLEG): container finished" podID="a8ae4b06-fd52-41a4-9111-bfbb4347444f" containerID="686944bc4baac63740b0db9a168fb61961862dcf6e7cf4141a169c2de461f13b" exitCode=0 Dec 16 15:40:32 crc kubenswrapper[4775]: I1216 15:40:32.602813 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8969g" 
event={"ID":"a8ae4b06-fd52-41a4-9111-bfbb4347444f","Type":"ContainerDied","Data":"686944bc4baac63740b0db9a168fb61961862dcf6e7cf4141a169c2de461f13b"} Dec 16 15:40:32 crc kubenswrapper[4775]: I1216 15:40:32.602837 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8969g" event={"ID":"a8ae4b06-fd52-41a4-9111-bfbb4347444f","Type":"ContainerStarted","Data":"58c9381f6548cda83e507338a41bf1cf9291bba44b71da9108906a29b0348c25"} Dec 16 15:40:33 crc kubenswrapper[4775]: I1216 15:40:33.614162 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jmhtx" event={"ID":"c0751bbc-1693-4de3-a2c7-d522fee65730","Type":"ContainerStarted","Data":"729db8c585f8f124a82ca7d210c46a428c8d497af7e69b45c1e6668c0f3d0455"} Dec 16 15:40:34 crc kubenswrapper[4775]: I1216 15:40:34.624398 4775 generic.go:334] "Generic (PLEG): container finished" podID="a8ae4b06-fd52-41a4-9111-bfbb4347444f" containerID="24154ca6e7fb2f08db2fdef97ff1f387be01b553dce0ea91a31fd3f0b16f14dc" exitCode=0 Dec 16 15:40:34 crc kubenswrapper[4775]: I1216 15:40:34.624466 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8969g" event={"ID":"a8ae4b06-fd52-41a4-9111-bfbb4347444f","Type":"ContainerDied","Data":"24154ca6e7fb2f08db2fdef97ff1f387be01b553dce0ea91a31fd3f0b16f14dc"} Dec 16 15:40:34 crc kubenswrapper[4775]: I1216 15:40:34.652209 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jmhtx" podStartSLOduration=4.108105283 podStartE2EDuration="6.652185932s" podCreationTimestamp="2025-12-16 15:40:28 +0000 UTC" firstStartedPulling="2025-12-16 15:40:30.499750734 +0000 UTC m=+2755.450829657" lastFinishedPulling="2025-12-16 15:40:33.043831373 +0000 UTC m=+2757.994910306" observedRunningTime="2025-12-16 15:40:33.629132024 +0000 UTC m=+2758.580210977" watchObservedRunningTime="2025-12-16 15:40:34.652185932 +0000 UTC 
m=+2759.603264855" Dec 16 15:40:35 crc kubenswrapper[4775]: I1216 15:40:35.635947 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8969g" event={"ID":"a8ae4b06-fd52-41a4-9111-bfbb4347444f","Type":"ContainerStarted","Data":"d68667531cc7ba01892fb654c33514a903b2ffd19af823c5949c74c6514cdb78"} Dec 16 15:40:35 crc kubenswrapper[4775]: I1216 15:40:35.662136 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8969g" podStartSLOduration=1.9516187299999999 podStartE2EDuration="4.662116442s" podCreationTimestamp="2025-12-16 15:40:31 +0000 UTC" firstStartedPulling="2025-12-16 15:40:32.604293233 +0000 UTC m=+2757.555372146" lastFinishedPulling="2025-12-16 15:40:35.314790935 +0000 UTC m=+2760.265869858" observedRunningTime="2025-12-16 15:40:35.652486876 +0000 UTC m=+2760.603565799" watchObservedRunningTime="2025-12-16 15:40:35.662116442 +0000 UTC m=+2760.613195365" Dec 16 15:40:39 crc kubenswrapper[4775]: I1216 15:40:39.175074 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jmhtx" Dec 16 15:40:39 crc kubenswrapper[4775]: I1216 15:40:39.175404 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jmhtx" Dec 16 15:40:39 crc kubenswrapper[4775]: I1216 15:40:39.233816 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jmhtx" Dec 16 15:40:39 crc kubenswrapper[4775]: I1216 15:40:39.714394 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jmhtx" Dec 16 15:40:40 crc kubenswrapper[4775]: I1216 15:40:40.632536 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jmhtx"] Dec 16 15:40:41 crc kubenswrapper[4775]: I1216 15:40:41.370670 4775 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8969g" Dec 16 15:40:41 crc kubenswrapper[4775]: I1216 15:40:41.370733 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8969g" Dec 16 15:40:41 crc kubenswrapper[4775]: I1216 15:40:41.423590 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8969g" Dec 16 15:40:41 crc kubenswrapper[4775]: I1216 15:40:41.686991 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jmhtx" podUID="c0751bbc-1693-4de3-a2c7-d522fee65730" containerName="registry-server" containerID="cri-o://729db8c585f8f124a82ca7d210c46a428c8d497af7e69b45c1e6668c0f3d0455" gracePeriod=2 Dec 16 15:40:41 crc kubenswrapper[4775]: I1216 15:40:41.734358 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8969g" Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.153914 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jmhtx" Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.241356 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfk6\" (UniqueName: \"kubernetes.io/projected/c0751bbc-1693-4de3-a2c7-d522fee65730-kube-api-access-9xfk6\") pod \"c0751bbc-1693-4de3-a2c7-d522fee65730\" (UID: \"c0751bbc-1693-4de3-a2c7-d522fee65730\") " Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.241517 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0751bbc-1693-4de3-a2c7-d522fee65730-catalog-content\") pod \"c0751bbc-1693-4de3-a2c7-d522fee65730\" (UID: \"c0751bbc-1693-4de3-a2c7-d522fee65730\") " Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.241562 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0751bbc-1693-4de3-a2c7-d522fee65730-utilities\") pod \"c0751bbc-1693-4de3-a2c7-d522fee65730\" (UID: \"c0751bbc-1693-4de3-a2c7-d522fee65730\") " Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.242550 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0751bbc-1693-4de3-a2c7-d522fee65730-utilities" (OuterVolumeSpecName: "utilities") pod "c0751bbc-1693-4de3-a2c7-d522fee65730" (UID: "c0751bbc-1693-4de3-a2c7-d522fee65730"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.247309 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0751bbc-1693-4de3-a2c7-d522fee65730-kube-api-access-9xfk6" (OuterVolumeSpecName: "kube-api-access-9xfk6") pod "c0751bbc-1693-4de3-a2c7-d522fee65730" (UID: "c0751bbc-1693-4de3-a2c7-d522fee65730"). InnerVolumeSpecName "kube-api-access-9xfk6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.295522 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0751bbc-1693-4de3-a2c7-d522fee65730-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0751bbc-1693-4de3-a2c7-d522fee65730" (UID: "c0751bbc-1693-4de3-a2c7-d522fee65730"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.343682 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfk6\" (UniqueName: \"kubernetes.io/projected/c0751bbc-1693-4de3-a2c7-d522fee65730-kube-api-access-9xfk6\") on node \"crc\" DevicePath \"\"" Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.343734 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0751bbc-1693-4de3-a2c7-d522fee65730-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.343748 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0751bbc-1693-4de3-a2c7-d522fee65730-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.696546 4775 generic.go:334] "Generic (PLEG): container finished" podID="c0751bbc-1693-4de3-a2c7-d522fee65730" containerID="729db8c585f8f124a82ca7d210c46a428c8d497af7e69b45c1e6668c0f3d0455" exitCode=0 Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.696849 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jmhtx" event={"ID":"c0751bbc-1693-4de3-a2c7-d522fee65730","Type":"ContainerDied","Data":"729db8c585f8f124a82ca7d210c46a428c8d497af7e69b45c1e6668c0f3d0455"} Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.696912 4775 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jmhtx" Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.696934 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jmhtx" event={"ID":"c0751bbc-1693-4de3-a2c7-d522fee65730","Type":"ContainerDied","Data":"3933fb0a84f612b61ed1a50dddcdd39c773cb2bd6a4453fe27bc458f2c06b60e"} Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.696958 4775 scope.go:117] "RemoveContainer" containerID="729db8c585f8f124a82ca7d210c46a428c8d497af7e69b45c1e6668c0f3d0455" Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.718847 4775 scope.go:117] "RemoveContainer" containerID="9c3575f9de27cff02bf4329ade60be87846e8f63fb5d19dbf82015c209549902" Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.748158 4775 scope.go:117] "RemoveContainer" containerID="314914c32f5657c2a0849ddfd05b10edd26775a5c0afd76684735b59a8a47615" Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.755293 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jmhtx"] Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.772246 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jmhtx"] Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.790878 4775 scope.go:117] "RemoveContainer" containerID="729db8c585f8f124a82ca7d210c46a428c8d497af7e69b45c1e6668c0f3d0455" Dec 16 15:40:42 crc kubenswrapper[4775]: E1216 15:40:42.791361 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"729db8c585f8f124a82ca7d210c46a428c8d497af7e69b45c1e6668c0f3d0455\": container with ID starting with 729db8c585f8f124a82ca7d210c46a428c8d497af7e69b45c1e6668c0f3d0455 not found: ID does not exist" containerID="729db8c585f8f124a82ca7d210c46a428c8d497af7e69b45c1e6668c0f3d0455" Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.791410 
4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"729db8c585f8f124a82ca7d210c46a428c8d497af7e69b45c1e6668c0f3d0455"} err="failed to get container status \"729db8c585f8f124a82ca7d210c46a428c8d497af7e69b45c1e6668c0f3d0455\": rpc error: code = NotFound desc = could not find container \"729db8c585f8f124a82ca7d210c46a428c8d497af7e69b45c1e6668c0f3d0455\": container with ID starting with 729db8c585f8f124a82ca7d210c46a428c8d497af7e69b45c1e6668c0f3d0455 not found: ID does not exist" Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.791439 4775 scope.go:117] "RemoveContainer" containerID="9c3575f9de27cff02bf4329ade60be87846e8f63fb5d19dbf82015c209549902" Dec 16 15:40:42 crc kubenswrapper[4775]: E1216 15:40:42.791959 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c3575f9de27cff02bf4329ade60be87846e8f63fb5d19dbf82015c209549902\": container with ID starting with 9c3575f9de27cff02bf4329ade60be87846e8f63fb5d19dbf82015c209549902 not found: ID does not exist" containerID="9c3575f9de27cff02bf4329ade60be87846e8f63fb5d19dbf82015c209549902" Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.792007 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c3575f9de27cff02bf4329ade60be87846e8f63fb5d19dbf82015c209549902"} err="failed to get container status \"9c3575f9de27cff02bf4329ade60be87846e8f63fb5d19dbf82015c209549902\": rpc error: code = NotFound desc = could not find container \"9c3575f9de27cff02bf4329ade60be87846e8f63fb5d19dbf82015c209549902\": container with ID starting with 9c3575f9de27cff02bf4329ade60be87846e8f63fb5d19dbf82015c209549902 not found: ID does not exist" Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.792041 4775 scope.go:117] "RemoveContainer" containerID="314914c32f5657c2a0849ddfd05b10edd26775a5c0afd76684735b59a8a47615" Dec 16 15:40:42 crc kubenswrapper[4775]: E1216 
15:40:42.792359 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"314914c32f5657c2a0849ddfd05b10edd26775a5c0afd76684735b59a8a47615\": container with ID starting with 314914c32f5657c2a0849ddfd05b10edd26775a5c0afd76684735b59a8a47615 not found: ID does not exist" containerID="314914c32f5657c2a0849ddfd05b10edd26775a5c0afd76684735b59a8a47615" Dec 16 15:40:42 crc kubenswrapper[4775]: I1216 15:40:42.792394 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"314914c32f5657c2a0849ddfd05b10edd26775a5c0afd76684735b59a8a47615"} err="failed to get container status \"314914c32f5657c2a0849ddfd05b10edd26775a5c0afd76684735b59a8a47615\": rpc error: code = NotFound desc = could not find container \"314914c32f5657c2a0849ddfd05b10edd26775a5c0afd76684735b59a8a47615\": container with ID starting with 314914c32f5657c2a0849ddfd05b10edd26775a5c0afd76684735b59a8a47615 not found: ID does not exist" Dec 16 15:40:43 crc kubenswrapper[4775]: I1216 15:40:43.349963 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0751bbc-1693-4de3-a2c7-d522fee65730" path="/var/lib/kubelet/pods/c0751bbc-1693-4de3-a2c7-d522fee65730/volumes" Dec 16 15:40:43 crc kubenswrapper[4775]: I1216 15:40:43.834387 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8969g"] Dec 16 15:40:43 crc kubenswrapper[4775]: I1216 15:40:43.834681 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8969g" podUID="a8ae4b06-fd52-41a4-9111-bfbb4347444f" containerName="registry-server" containerID="cri-o://d68667531cc7ba01892fb654c33514a903b2ffd19af823c5949c74c6514cdb78" gracePeriod=2 Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.293588 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8969g" Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.382678 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ae4b06-fd52-41a4-9111-bfbb4347444f-utilities\") pod \"a8ae4b06-fd52-41a4-9111-bfbb4347444f\" (UID: \"a8ae4b06-fd52-41a4-9111-bfbb4347444f\") " Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.382987 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ae4b06-fd52-41a4-9111-bfbb4347444f-catalog-content\") pod \"a8ae4b06-fd52-41a4-9111-bfbb4347444f\" (UID: \"a8ae4b06-fd52-41a4-9111-bfbb4347444f\") " Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.383266 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj24b\" (UniqueName: \"kubernetes.io/projected/a8ae4b06-fd52-41a4-9111-bfbb4347444f-kube-api-access-pj24b\") pod \"a8ae4b06-fd52-41a4-9111-bfbb4347444f\" (UID: \"a8ae4b06-fd52-41a4-9111-bfbb4347444f\") " Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.385201 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8ae4b06-fd52-41a4-9111-bfbb4347444f-utilities" (OuterVolumeSpecName: "utilities") pod "a8ae4b06-fd52-41a4-9111-bfbb4347444f" (UID: "a8ae4b06-fd52-41a4-9111-bfbb4347444f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.389451 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8ae4b06-fd52-41a4-9111-bfbb4347444f-kube-api-access-pj24b" (OuterVolumeSpecName: "kube-api-access-pj24b") pod "a8ae4b06-fd52-41a4-9111-bfbb4347444f" (UID: "a8ae4b06-fd52-41a4-9111-bfbb4347444f"). InnerVolumeSpecName "kube-api-access-pj24b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.405032 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8ae4b06-fd52-41a4-9111-bfbb4347444f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8ae4b06-fd52-41a4-9111-bfbb4347444f" (UID: "a8ae4b06-fd52-41a4-9111-bfbb4347444f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.486157 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ae4b06-fd52-41a4-9111-bfbb4347444f-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.486202 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ae4b06-fd52-41a4-9111-bfbb4347444f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.486219 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj24b\" (UniqueName: \"kubernetes.io/projected/a8ae4b06-fd52-41a4-9111-bfbb4347444f-kube-api-access-pj24b\") on node \"crc\" DevicePath \"\"" Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.724323 4775 generic.go:334] "Generic (PLEG): container finished" podID="a8ae4b06-fd52-41a4-9111-bfbb4347444f" containerID="d68667531cc7ba01892fb654c33514a903b2ffd19af823c5949c74c6514cdb78" exitCode=0 Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.724392 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8969g" event={"ID":"a8ae4b06-fd52-41a4-9111-bfbb4347444f","Type":"ContainerDied","Data":"d68667531cc7ba01892fb654c33514a903b2ffd19af823c5949c74c6514cdb78"} Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.725025 4775 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-8969g" event={"ID":"a8ae4b06-fd52-41a4-9111-bfbb4347444f","Type":"ContainerDied","Data":"58c9381f6548cda83e507338a41bf1cf9291bba44b71da9108906a29b0348c25"} Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.725052 4775 scope.go:117] "RemoveContainer" containerID="d68667531cc7ba01892fb654c33514a903b2ffd19af823c5949c74c6514cdb78" Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.724435 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8969g" Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.756745 4775 scope.go:117] "RemoveContainer" containerID="24154ca6e7fb2f08db2fdef97ff1f387be01b553dce0ea91a31fd3f0b16f14dc" Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.767196 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8969g"] Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.778132 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8969g"] Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.804078 4775 scope.go:117] "RemoveContainer" containerID="686944bc4baac63740b0db9a168fb61961862dcf6e7cf4141a169c2de461f13b" Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.827865 4775 scope.go:117] "RemoveContainer" containerID="d68667531cc7ba01892fb654c33514a903b2ffd19af823c5949c74c6514cdb78" Dec 16 15:40:44 crc kubenswrapper[4775]: E1216 15:40:44.828408 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d68667531cc7ba01892fb654c33514a903b2ffd19af823c5949c74c6514cdb78\": container with ID starting with d68667531cc7ba01892fb654c33514a903b2ffd19af823c5949c74c6514cdb78 not found: ID does not exist" containerID="d68667531cc7ba01892fb654c33514a903b2ffd19af823c5949c74c6514cdb78" Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.828452 4775 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d68667531cc7ba01892fb654c33514a903b2ffd19af823c5949c74c6514cdb78"} err="failed to get container status \"d68667531cc7ba01892fb654c33514a903b2ffd19af823c5949c74c6514cdb78\": rpc error: code = NotFound desc = could not find container \"d68667531cc7ba01892fb654c33514a903b2ffd19af823c5949c74c6514cdb78\": container with ID starting with d68667531cc7ba01892fb654c33514a903b2ffd19af823c5949c74c6514cdb78 not found: ID does not exist" Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.828482 4775 scope.go:117] "RemoveContainer" containerID="24154ca6e7fb2f08db2fdef97ff1f387be01b553dce0ea91a31fd3f0b16f14dc" Dec 16 15:40:44 crc kubenswrapper[4775]: E1216 15:40:44.828848 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24154ca6e7fb2f08db2fdef97ff1f387be01b553dce0ea91a31fd3f0b16f14dc\": container with ID starting with 24154ca6e7fb2f08db2fdef97ff1f387be01b553dce0ea91a31fd3f0b16f14dc not found: ID does not exist" containerID="24154ca6e7fb2f08db2fdef97ff1f387be01b553dce0ea91a31fd3f0b16f14dc" Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.828875 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24154ca6e7fb2f08db2fdef97ff1f387be01b553dce0ea91a31fd3f0b16f14dc"} err="failed to get container status \"24154ca6e7fb2f08db2fdef97ff1f387be01b553dce0ea91a31fd3f0b16f14dc\": rpc error: code = NotFound desc = could not find container \"24154ca6e7fb2f08db2fdef97ff1f387be01b553dce0ea91a31fd3f0b16f14dc\": container with ID starting with 24154ca6e7fb2f08db2fdef97ff1f387be01b553dce0ea91a31fd3f0b16f14dc not found: ID does not exist" Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.828903 4775 scope.go:117] "RemoveContainer" containerID="686944bc4baac63740b0db9a168fb61961862dcf6e7cf4141a169c2de461f13b" Dec 16 15:40:44 crc kubenswrapper[4775]: E1216 
15:40:44.829164 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"686944bc4baac63740b0db9a168fb61961862dcf6e7cf4141a169c2de461f13b\": container with ID starting with 686944bc4baac63740b0db9a168fb61961862dcf6e7cf4141a169c2de461f13b not found: ID does not exist" containerID="686944bc4baac63740b0db9a168fb61961862dcf6e7cf4141a169c2de461f13b" Dec 16 15:40:44 crc kubenswrapper[4775]: I1216 15:40:44.829207 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"686944bc4baac63740b0db9a168fb61961862dcf6e7cf4141a169c2de461f13b"} err="failed to get container status \"686944bc4baac63740b0db9a168fb61961862dcf6e7cf4141a169c2de461f13b\": rpc error: code = NotFound desc = could not find container \"686944bc4baac63740b0db9a168fb61961862dcf6e7cf4141a169c2de461f13b\": container with ID starting with 686944bc4baac63740b0db9a168fb61961862dcf6e7cf4141a169c2de461f13b not found: ID does not exist" Dec 16 15:40:45 crc kubenswrapper[4775]: I1216 15:40:45.348754 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8ae4b06-fd52-41a4-9111-bfbb4347444f" path="/var/lib/kubelet/pods/a8ae4b06-fd52-41a4-9111-bfbb4347444f/volumes" Dec 16 15:41:34 crc kubenswrapper[4775]: I1216 15:41:34.340341 4775 generic.go:334] "Generic (PLEG): container finished" podID="533cc620-42ce-4262-bcfe-25c8ebe74ff6" containerID="c59ca338db88c7f31220e819463c2205e4774f0aff112d61ca23d8705170035f" exitCode=0 Dec 16 15:41:34 crc kubenswrapper[4775]: I1216 15:41:34.340413 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" event={"ID":"533cc620-42ce-4262-bcfe-25c8ebe74ff6","Type":"ContainerDied","Data":"c59ca338db88c7f31220e819463c2205e4774f0aff112d61ca23d8705170035f"} Dec 16 15:41:35 crc kubenswrapper[4775]: I1216 15:41:35.833836 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:41:35 crc kubenswrapper[4775]: I1216 15:41:35.961413 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-2\") pod \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " Dec 16 15:41:35 crc kubenswrapper[4775]: I1216 15:41:35.961471 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-0\") pod \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " Dec 16 15:41:35 crc kubenswrapper[4775]: I1216 15:41:35.961504 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpgnr\" (UniqueName: \"kubernetes.io/projected/533cc620-42ce-4262-bcfe-25c8ebe74ff6-kube-api-access-jpgnr\") pod \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " Dec 16 15:41:35 crc kubenswrapper[4775]: I1216 15:41:35.961575 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ssh-key\") pod \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " Dec 16 15:41:35 crc kubenswrapper[4775]: I1216 15:41:35.961607 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-telemetry-combined-ca-bundle\") pod \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " Dec 16 15:41:35 crc kubenswrapper[4775]: I1216 
15:41:35.961661 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-inventory\") pod \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " Dec 16 15:41:35 crc kubenswrapper[4775]: I1216 15:41:35.961722 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-1\") pod \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\" (UID: \"533cc620-42ce-4262-bcfe-25c8ebe74ff6\") " Dec 16 15:41:35 crc kubenswrapper[4775]: I1216 15:41:35.970074 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/533cc620-42ce-4262-bcfe-25c8ebe74ff6-kube-api-access-jpgnr" (OuterVolumeSpecName: "kube-api-access-jpgnr") pod "533cc620-42ce-4262-bcfe-25c8ebe74ff6" (UID: "533cc620-42ce-4262-bcfe-25c8ebe74ff6"). InnerVolumeSpecName "kube-api-access-jpgnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:41:35 crc kubenswrapper[4775]: I1216 15:41:35.975822 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "533cc620-42ce-4262-bcfe-25c8ebe74ff6" (UID: "533cc620-42ce-4262-bcfe-25c8ebe74ff6"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:41:35 crc kubenswrapper[4775]: I1216 15:41:35.990304 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "533cc620-42ce-4262-bcfe-25c8ebe74ff6" (UID: "533cc620-42ce-4262-bcfe-25c8ebe74ff6"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:41:35 crc kubenswrapper[4775]: I1216 15:41:35.991272 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "533cc620-42ce-4262-bcfe-25c8ebe74ff6" (UID: "533cc620-42ce-4262-bcfe-25c8ebe74ff6"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:41:35 crc kubenswrapper[4775]: I1216 15:41:35.995006 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "533cc620-42ce-4262-bcfe-25c8ebe74ff6" (UID: "533cc620-42ce-4262-bcfe-25c8ebe74ff6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:41:35 crc kubenswrapper[4775]: I1216 15:41:35.997512 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-inventory" (OuterVolumeSpecName: "inventory") pod "533cc620-42ce-4262-bcfe-25c8ebe74ff6" (UID: "533cc620-42ce-4262-bcfe-25c8ebe74ff6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:41:36 crc kubenswrapper[4775]: I1216 15:41:36.003765 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "533cc620-42ce-4262-bcfe-25c8ebe74ff6" (UID: "533cc620-42ce-4262-bcfe-25c8ebe74ff6"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:41:36 crc kubenswrapper[4775]: I1216 15:41:36.064526 4775 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 16 15:41:36 crc kubenswrapper[4775]: I1216 15:41:36.064713 4775 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:41:36 crc kubenswrapper[4775]: I1216 15:41:36.064731 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpgnr\" (UniqueName: \"kubernetes.io/projected/533cc620-42ce-4262-bcfe-25c8ebe74ff6-kube-api-access-jpgnr\") on node \"crc\" DevicePath \"\"" Dec 16 15:41:36 crc kubenswrapper[4775]: I1216 15:41:36.064745 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:41:36 crc kubenswrapper[4775]: I1216 15:41:36.064755 4775 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:41:36 crc 
kubenswrapper[4775]: I1216 15:41:36.064765 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:41:36 crc kubenswrapper[4775]: I1216 15:41:36.064774 4775 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/533cc620-42ce-4262-bcfe-25c8ebe74ff6-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 16 15:41:36 crc kubenswrapper[4775]: I1216 15:41:36.359872 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" event={"ID":"533cc620-42ce-4262-bcfe-25c8ebe74ff6","Type":"ContainerDied","Data":"7c70fb48387bcc31b16d66356f7ca3d403a9d9822fa9b0614c4cf2f2e9074da7"} Dec 16 15:41:36 crc kubenswrapper[4775]: I1216 15:41:36.360199 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c70fb48387bcc31b16d66356f7ca3d403a9d9822fa9b0614c4cf2f2e9074da7" Dec 16 15:41:36 crc kubenswrapper[4775]: I1216 15:41:36.360014 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9n929" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.311861 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8lcsg"] Dec 16 15:41:43 crc kubenswrapper[4775]: E1216 15:41:43.312883 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0751bbc-1693-4de3-a2c7-d522fee65730" containerName="registry-server" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.313315 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0751bbc-1693-4de3-a2c7-d522fee65730" containerName="registry-server" Dec 16 15:41:43 crc kubenswrapper[4775]: E1216 15:41:43.313343 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ae4b06-fd52-41a4-9111-bfbb4347444f" containerName="extract-content" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.313350 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ae4b06-fd52-41a4-9111-bfbb4347444f" containerName="extract-content" Dec 16 15:41:43 crc kubenswrapper[4775]: E1216 15:41:43.313364 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ae4b06-fd52-41a4-9111-bfbb4347444f" containerName="extract-utilities" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.313373 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ae4b06-fd52-41a4-9111-bfbb4347444f" containerName="extract-utilities" Dec 16 15:41:43 crc kubenswrapper[4775]: E1216 15:41:43.313384 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ae4b06-fd52-41a4-9111-bfbb4347444f" containerName="registry-server" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.313392 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ae4b06-fd52-41a4-9111-bfbb4347444f" containerName="registry-server" Dec 16 15:41:43 crc kubenswrapper[4775]: E1216 15:41:43.313423 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="533cc620-42ce-4262-bcfe-25c8ebe74ff6" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.313433 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="533cc620-42ce-4262-bcfe-25c8ebe74ff6" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 16 15:41:43 crc kubenswrapper[4775]: E1216 15:41:43.313458 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0751bbc-1693-4de3-a2c7-d522fee65730" containerName="extract-content" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.313467 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0751bbc-1693-4de3-a2c7-d522fee65730" containerName="extract-content" Dec 16 15:41:43 crc kubenswrapper[4775]: E1216 15:41:43.313481 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0751bbc-1693-4de3-a2c7-d522fee65730" containerName="extract-utilities" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.313488 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0751bbc-1693-4de3-a2c7-d522fee65730" containerName="extract-utilities" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.313707 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8ae4b06-fd52-41a4-9111-bfbb4347444f" containerName="registry-server" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.313724 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="533cc620-42ce-4262-bcfe-25c8ebe74ff6" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.313743 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0751bbc-1693-4de3-a2c7-d522fee65730" containerName="registry-server" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.315436 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8lcsg" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.336489 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8lcsg"] Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.509811 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5284b5e8-c10c-4060-bc41-9a828df8b29d-catalog-content\") pod \"community-operators-8lcsg\" (UID: \"5284b5e8-c10c-4060-bc41-9a828df8b29d\") " pod="openshift-marketplace/community-operators-8lcsg" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.509881 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-592q6\" (UniqueName: \"kubernetes.io/projected/5284b5e8-c10c-4060-bc41-9a828df8b29d-kube-api-access-592q6\") pod \"community-operators-8lcsg\" (UID: \"5284b5e8-c10c-4060-bc41-9a828df8b29d\") " pod="openshift-marketplace/community-operators-8lcsg" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.510007 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5284b5e8-c10c-4060-bc41-9a828df8b29d-utilities\") pod \"community-operators-8lcsg\" (UID: \"5284b5e8-c10c-4060-bc41-9a828df8b29d\") " pod="openshift-marketplace/community-operators-8lcsg" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.612010 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5284b5e8-c10c-4060-bc41-9a828df8b29d-catalog-content\") pod \"community-operators-8lcsg\" (UID: \"5284b5e8-c10c-4060-bc41-9a828df8b29d\") " pod="openshift-marketplace/community-operators-8lcsg" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.612072 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-592q6\" (UniqueName: \"kubernetes.io/projected/5284b5e8-c10c-4060-bc41-9a828df8b29d-kube-api-access-592q6\") pod \"community-operators-8lcsg\" (UID: \"5284b5e8-c10c-4060-bc41-9a828df8b29d\") " pod="openshift-marketplace/community-operators-8lcsg" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.612156 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5284b5e8-c10c-4060-bc41-9a828df8b29d-utilities\") pod \"community-operators-8lcsg\" (UID: \"5284b5e8-c10c-4060-bc41-9a828df8b29d\") " pod="openshift-marketplace/community-operators-8lcsg" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.612627 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5284b5e8-c10c-4060-bc41-9a828df8b29d-utilities\") pod \"community-operators-8lcsg\" (UID: \"5284b5e8-c10c-4060-bc41-9a828df8b29d\") " pod="openshift-marketplace/community-operators-8lcsg" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.612861 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5284b5e8-c10c-4060-bc41-9a828df8b29d-catalog-content\") pod \"community-operators-8lcsg\" (UID: \"5284b5e8-c10c-4060-bc41-9a828df8b29d\") " pod="openshift-marketplace/community-operators-8lcsg" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.638685 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-592q6\" (UniqueName: \"kubernetes.io/projected/5284b5e8-c10c-4060-bc41-9a828df8b29d-kube-api-access-592q6\") pod \"community-operators-8lcsg\" (UID: \"5284b5e8-c10c-4060-bc41-9a828df8b29d\") " pod="openshift-marketplace/community-operators-8lcsg" Dec 16 15:41:43 crc kubenswrapper[4775]: I1216 15:41:43.644448 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8lcsg" Dec 16 15:41:44 crc kubenswrapper[4775]: I1216 15:41:44.166909 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8lcsg"] Dec 16 15:41:44 crc kubenswrapper[4775]: I1216 15:41:44.427295 4775 generic.go:334] "Generic (PLEG): container finished" podID="5284b5e8-c10c-4060-bc41-9a828df8b29d" containerID="5863b513377e91555f65df597a21e72786f1323b63b11119311dfcadc8e49d35" exitCode=0 Dec 16 15:41:44 crc kubenswrapper[4775]: I1216 15:41:44.427369 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lcsg" event={"ID":"5284b5e8-c10c-4060-bc41-9a828df8b29d","Type":"ContainerDied","Data":"5863b513377e91555f65df597a21e72786f1323b63b11119311dfcadc8e49d35"} Dec 16 15:41:44 crc kubenswrapper[4775]: I1216 15:41:44.427408 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lcsg" event={"ID":"5284b5e8-c10c-4060-bc41-9a828df8b29d","Type":"ContainerStarted","Data":"75d3ca0ec2774cde91150c4c4e9542ddfc4a13558259b2808494608059ce4c76"} Dec 16 15:41:44 crc kubenswrapper[4775]: I1216 15:41:44.430055 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 15:41:45 crc kubenswrapper[4775]: I1216 15:41:45.437428 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lcsg" event={"ID":"5284b5e8-c10c-4060-bc41-9a828df8b29d","Type":"ContainerStarted","Data":"ea4b61950d9c7587f26393616f3a57b29f28c05ac4b6fc7becfebb6f586118bb"} Dec 16 15:41:46 crc kubenswrapper[4775]: I1216 15:41:46.446968 4775 generic.go:334] "Generic (PLEG): container finished" podID="5284b5e8-c10c-4060-bc41-9a828df8b29d" containerID="ea4b61950d9c7587f26393616f3a57b29f28c05ac4b6fc7becfebb6f586118bb" exitCode=0 Dec 16 15:41:46 crc kubenswrapper[4775]: I1216 15:41:46.447028 4775 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-8lcsg" event={"ID":"5284b5e8-c10c-4060-bc41-9a828df8b29d","Type":"ContainerDied","Data":"ea4b61950d9c7587f26393616f3a57b29f28c05ac4b6fc7becfebb6f586118bb"} Dec 16 15:41:47 crc kubenswrapper[4775]: I1216 15:41:47.460391 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lcsg" event={"ID":"5284b5e8-c10c-4060-bc41-9a828df8b29d","Type":"ContainerStarted","Data":"1dfb4510c3f0b6e10a433e5fa4eed339596b553c430cf8103c26dc3175988002"} Dec 16 15:41:47 crc kubenswrapper[4775]: I1216 15:41:47.478992 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8lcsg" podStartSLOduration=1.962990367 podStartE2EDuration="4.478970149s" podCreationTimestamp="2025-12-16 15:41:43 +0000 UTC" firstStartedPulling="2025-12-16 15:41:44.429835125 +0000 UTC m=+2829.380914048" lastFinishedPulling="2025-12-16 15:41:46.945814907 +0000 UTC m=+2831.896893830" observedRunningTime="2025-12-16 15:41:47.47806996 +0000 UTC m=+2832.429148913" watchObservedRunningTime="2025-12-16 15:41:47.478970149 +0000 UTC m=+2832.430049102" Dec 16 15:41:53 crc kubenswrapper[4775]: I1216 15:41:53.646295 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8lcsg" Dec 16 15:41:53 crc kubenswrapper[4775]: I1216 15:41:53.646801 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8lcsg" Dec 16 15:41:53 crc kubenswrapper[4775]: I1216 15:41:53.713244 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8lcsg" Dec 16 15:41:54 crc kubenswrapper[4775]: I1216 15:41:54.570866 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8lcsg" Dec 16 15:41:54 crc kubenswrapper[4775]: I1216 15:41:54.626670 
4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8lcsg"] Dec 16 15:41:56 crc kubenswrapper[4775]: I1216 15:41:56.540789 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8lcsg" podUID="5284b5e8-c10c-4060-bc41-9a828df8b29d" containerName="registry-server" containerID="cri-o://1dfb4510c3f0b6e10a433e5fa4eed339596b553c430cf8103c26dc3175988002" gracePeriod=2 Dec 16 15:41:56 crc kubenswrapper[4775]: I1216 15:41:56.984648 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8lcsg" Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.173344 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5284b5e8-c10c-4060-bc41-9a828df8b29d-utilities\") pod \"5284b5e8-c10c-4060-bc41-9a828df8b29d\" (UID: \"5284b5e8-c10c-4060-bc41-9a828df8b29d\") " Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.173403 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5284b5e8-c10c-4060-bc41-9a828df8b29d-catalog-content\") pod \"5284b5e8-c10c-4060-bc41-9a828df8b29d\" (UID: \"5284b5e8-c10c-4060-bc41-9a828df8b29d\") " Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.173510 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-592q6\" (UniqueName: \"kubernetes.io/projected/5284b5e8-c10c-4060-bc41-9a828df8b29d-kube-api-access-592q6\") pod \"5284b5e8-c10c-4060-bc41-9a828df8b29d\" (UID: \"5284b5e8-c10c-4060-bc41-9a828df8b29d\") " Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.174685 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5284b5e8-c10c-4060-bc41-9a828df8b29d-utilities" (OuterVolumeSpecName: "utilities") pod 
"5284b5e8-c10c-4060-bc41-9a828df8b29d" (UID: "5284b5e8-c10c-4060-bc41-9a828df8b29d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.180018 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5284b5e8-c10c-4060-bc41-9a828df8b29d-kube-api-access-592q6" (OuterVolumeSpecName: "kube-api-access-592q6") pod "5284b5e8-c10c-4060-bc41-9a828df8b29d" (UID: "5284b5e8-c10c-4060-bc41-9a828df8b29d"). InnerVolumeSpecName "kube-api-access-592q6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.237878 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5284b5e8-c10c-4060-bc41-9a828df8b29d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5284b5e8-c10c-4060-bc41-9a828df8b29d" (UID: "5284b5e8-c10c-4060-bc41-9a828df8b29d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.276222 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5284b5e8-c10c-4060-bc41-9a828df8b29d-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.276272 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5284b5e8-c10c-4060-bc41-9a828df8b29d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.276289 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-592q6\" (UniqueName: \"kubernetes.io/projected/5284b5e8-c10c-4060-bc41-9a828df8b29d-kube-api-access-592q6\") on node \"crc\" DevicePath \"\"" Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.554692 4775 generic.go:334] "Generic (PLEG): container finished" podID="5284b5e8-c10c-4060-bc41-9a828df8b29d" containerID="1dfb4510c3f0b6e10a433e5fa4eed339596b553c430cf8103c26dc3175988002" exitCode=0 Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.554769 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lcsg" event={"ID":"5284b5e8-c10c-4060-bc41-9a828df8b29d","Type":"ContainerDied","Data":"1dfb4510c3f0b6e10a433e5fa4eed339596b553c430cf8103c26dc3175988002"} Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.554789 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8lcsg" Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.554810 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lcsg" event={"ID":"5284b5e8-c10c-4060-bc41-9a828df8b29d","Type":"ContainerDied","Data":"75d3ca0ec2774cde91150c4c4e9542ddfc4a13558259b2808494608059ce4c76"} Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.554835 4775 scope.go:117] "RemoveContainer" containerID="1dfb4510c3f0b6e10a433e5fa4eed339596b553c430cf8103c26dc3175988002" Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.590899 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8lcsg"] Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.595464 4775 scope.go:117] "RemoveContainer" containerID="ea4b61950d9c7587f26393616f3a57b29f28c05ac4b6fc7becfebb6f586118bb" Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.601656 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8lcsg"] Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.619011 4775 scope.go:117] "RemoveContainer" containerID="5863b513377e91555f65df597a21e72786f1323b63b11119311dfcadc8e49d35" Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.658317 4775 scope.go:117] "RemoveContainer" containerID="1dfb4510c3f0b6e10a433e5fa4eed339596b553c430cf8103c26dc3175988002" Dec 16 15:41:57 crc kubenswrapper[4775]: E1216 15:41:57.658858 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dfb4510c3f0b6e10a433e5fa4eed339596b553c430cf8103c26dc3175988002\": container with ID starting with 1dfb4510c3f0b6e10a433e5fa4eed339596b553c430cf8103c26dc3175988002 not found: ID does not exist" containerID="1dfb4510c3f0b6e10a433e5fa4eed339596b553c430cf8103c26dc3175988002" Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.658923 4775 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dfb4510c3f0b6e10a433e5fa4eed339596b553c430cf8103c26dc3175988002"} err="failed to get container status \"1dfb4510c3f0b6e10a433e5fa4eed339596b553c430cf8103c26dc3175988002\": rpc error: code = NotFound desc = could not find container \"1dfb4510c3f0b6e10a433e5fa4eed339596b553c430cf8103c26dc3175988002\": container with ID starting with 1dfb4510c3f0b6e10a433e5fa4eed339596b553c430cf8103c26dc3175988002 not found: ID does not exist" Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.658955 4775 scope.go:117] "RemoveContainer" containerID="ea4b61950d9c7587f26393616f3a57b29f28c05ac4b6fc7becfebb6f586118bb" Dec 16 15:41:57 crc kubenswrapper[4775]: E1216 15:41:57.659264 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea4b61950d9c7587f26393616f3a57b29f28c05ac4b6fc7becfebb6f586118bb\": container with ID starting with ea4b61950d9c7587f26393616f3a57b29f28c05ac4b6fc7becfebb6f586118bb not found: ID does not exist" containerID="ea4b61950d9c7587f26393616f3a57b29f28c05ac4b6fc7becfebb6f586118bb" Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.659289 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea4b61950d9c7587f26393616f3a57b29f28c05ac4b6fc7becfebb6f586118bb"} err="failed to get container status \"ea4b61950d9c7587f26393616f3a57b29f28c05ac4b6fc7becfebb6f586118bb\": rpc error: code = NotFound desc = could not find container \"ea4b61950d9c7587f26393616f3a57b29f28c05ac4b6fc7becfebb6f586118bb\": container with ID starting with ea4b61950d9c7587f26393616f3a57b29f28c05ac4b6fc7becfebb6f586118bb not found: ID does not exist" Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.659305 4775 scope.go:117] "RemoveContainer" containerID="5863b513377e91555f65df597a21e72786f1323b63b11119311dfcadc8e49d35" Dec 16 15:41:57 crc kubenswrapper[4775]: E1216 
15:41:57.659586 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5863b513377e91555f65df597a21e72786f1323b63b11119311dfcadc8e49d35\": container with ID starting with 5863b513377e91555f65df597a21e72786f1323b63b11119311dfcadc8e49d35 not found: ID does not exist" containerID="5863b513377e91555f65df597a21e72786f1323b63b11119311dfcadc8e49d35" Dec 16 15:41:57 crc kubenswrapper[4775]: I1216 15:41:57.659644 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5863b513377e91555f65df597a21e72786f1323b63b11119311dfcadc8e49d35"} err="failed to get container status \"5863b513377e91555f65df597a21e72786f1323b63b11119311dfcadc8e49d35\": rpc error: code = NotFound desc = could not find container \"5863b513377e91555f65df597a21e72786f1323b63b11119311dfcadc8e49d35\": container with ID starting with 5863b513377e91555f65df597a21e72786f1323b63b11119311dfcadc8e49d35 not found: ID does not exist" Dec 16 15:41:59 crc kubenswrapper[4775]: I1216 15:41:59.362737 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5284b5e8-c10c-4060-bc41-9a828df8b29d" path="/var/lib/kubelet/pods/5284b5e8-c10c-4060-bc41-9a828df8b29d/volumes" Dec 16 15:42:02 crc kubenswrapper[4775]: I1216 15:42:02.869615 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:42:02 crc kubenswrapper[4775]: I1216 15:42:02.869991 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 16 15:42:17 crc kubenswrapper[4775]: E1216 15:42:17.170434 4775 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.223:55284->38.102.83.223:37315: write tcp 38.102.83.223:55284->38.102.83.223:37315: write: broken pipe Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.768392 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 16 15:42:30 crc kubenswrapper[4775]: E1216 15:42:30.769810 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5284b5e8-c10c-4060-bc41-9a828df8b29d" containerName="extract-utilities" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.769837 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5284b5e8-c10c-4060-bc41-9a828df8b29d" containerName="extract-utilities" Dec 16 15:42:30 crc kubenswrapper[4775]: E1216 15:42:30.769936 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5284b5e8-c10c-4060-bc41-9a828df8b29d" containerName="extract-content" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.769963 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5284b5e8-c10c-4060-bc41-9a828df8b29d" containerName="extract-content" Dec 16 15:42:30 crc kubenswrapper[4775]: E1216 15:42:30.770003 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5284b5e8-c10c-4060-bc41-9a828df8b29d" containerName="registry-server" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.770019 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5284b5e8-c10c-4060-bc41-9a828df8b29d" containerName="registry-server" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.770370 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5284b5e8-c10c-4060-bc41-9a828df8b29d" containerName="registry-server" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.771374 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.775204 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.775670 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-q28vx" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.777497 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.779358 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.787398 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.928588 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81e92dde-6675-4a19-a619-52358e91c49c-config-data\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.928652 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/81e92dde-6675-4a19-a619-52358e91c49c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.928690 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwgrh\" (UniqueName: 
\"kubernetes.io/projected/81e92dde-6675-4a19-a619-52358e91c49c-kube-api-access-bwgrh\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.928713 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/81e92dde-6675-4a19-a619-52358e91c49c-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.928756 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.928811 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/81e92dde-6675-4a19-a619-52358e91c49c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.928869 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.928935 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:30 crc kubenswrapper[4775]: I1216 15:42:30.929235 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.031353 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81e92dde-6675-4a19-a619-52358e91c49c-config-data\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.031419 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/81e92dde-6675-4a19-a619-52358e91c49c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.031462 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwgrh\" (UniqueName: \"kubernetes.io/projected/81e92dde-6675-4a19-a619-52358e91c49c-kube-api-access-bwgrh\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.031486 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/81e92dde-6675-4a19-a619-52358e91c49c-openstack-config\") 
pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.031528 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.031550 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/81e92dde-6675-4a19-a619-52358e91c49c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.031585 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.031631 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.031703 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 
15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.032458 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.032852 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/81e92dde-6675-4a19-a619-52358e91c49c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.033038 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/81e92dde-6675-4a19-a619-52358e91c49c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.033419 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81e92dde-6675-4a19-a619-52358e91c49c-config-data\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.033538 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/81e92dde-6675-4a19-a619-52358e91c49c-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.041376 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.041580 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.052717 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.060501 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwgrh\" (UniqueName: \"kubernetes.io/projected/81e92dde-6675-4a19-a619-52358e91c49c-kube-api-access-bwgrh\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.071520 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.109735 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.696748 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 16 15:42:31 crc kubenswrapper[4775]: I1216 15:42:31.870872 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"81e92dde-6675-4a19-a619-52358e91c49c","Type":"ContainerStarted","Data":"0a918477c1e177fb308656593047d223bbfa10d44c07e47d2b5c39f5bf151685"} Dec 16 15:42:32 crc kubenswrapper[4775]: I1216 15:42:32.868995 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:42:32 crc kubenswrapper[4775]: I1216 15:42:32.869092 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:42:58 crc kubenswrapper[4775]: E1216 15:42:58.311646 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 16 15:42:58 crc kubenswrapper[4775]: E1216 15:42:58.312876 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bwgrh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(81e92dde-6675-4a19-a619-52358e91c49c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:42:58 crc kubenswrapper[4775]: E1216 15:42:58.314264 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="81e92dde-6675-4a19-a619-52358e91c49c" Dec 16 15:42:59 crc kubenswrapper[4775]: E1216 15:42:59.146722 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="81e92dde-6675-4a19-a619-52358e91c49c" Dec 16 15:43:02 crc 
kubenswrapper[4775]: I1216 15:43:02.868822 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:43:02 crc kubenswrapper[4775]: I1216 15:43:02.869360 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:43:02 crc kubenswrapper[4775]: I1216 15:43:02.869432 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 15:43:02 crc kubenswrapper[4775]: I1216 15:43:02.870406 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4729b3904585254169a82d5de79bd940893d6ca623bcb8a4ed43c5b86e8831f4"} pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:43:02 crc kubenswrapper[4775]: I1216 15:43:02.870503 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" containerID="cri-o://4729b3904585254169a82d5de79bd940893d6ca623bcb8a4ed43c5b86e8831f4" gracePeriod=600 Dec 16 15:43:03 crc kubenswrapper[4775]: I1216 15:43:03.194595 4775 generic.go:334] "Generic (PLEG): container finished" podID="584613dc-ef95-4911-9a79-76e805e1d4d1" 
containerID="4729b3904585254169a82d5de79bd940893d6ca623bcb8a4ed43c5b86e8831f4" exitCode=0 Dec 16 15:43:03 crc kubenswrapper[4775]: I1216 15:43:03.194647 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerDied","Data":"4729b3904585254169a82d5de79bd940893d6ca623bcb8a4ed43c5b86e8831f4"} Dec 16 15:43:03 crc kubenswrapper[4775]: I1216 15:43:03.195006 4775 scope.go:117] "RemoveContainer" containerID="cedc9e318cf1faf91b9aca3a062a7a600c284913569da01f9af7a4edc4adcb26" Dec 16 15:43:04 crc kubenswrapper[4775]: I1216 15:43:04.208829 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6"} Dec 16 15:43:11 crc kubenswrapper[4775]: I1216 15:43:11.793078 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 16 15:43:13 crc kubenswrapper[4775]: I1216 15:43:13.280642 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"81e92dde-6675-4a19-a619-52358e91c49c","Type":"ContainerStarted","Data":"4ba8033fe6f5ca4d275e4f546e31ae7d6fd3f7e1ac0b78d8563ff579820a2be9"} Dec 16 15:43:13 crc kubenswrapper[4775]: I1216 15:43:13.307093 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.2238494 podStartE2EDuration="44.307072945s" podCreationTimestamp="2025-12-16 15:42:29 +0000 UTC" firstStartedPulling="2025-12-16 15:42:31.707402068 +0000 UTC m=+2876.658480991" lastFinishedPulling="2025-12-16 15:43:11.790625613 +0000 UTC m=+2916.741704536" observedRunningTime="2025-12-16 15:43:13.298125946 +0000 UTC m=+2918.249204879" 
watchObservedRunningTime="2025-12-16 15:43:13.307072945 +0000 UTC m=+2918.258151858" Dec 16 15:45:00 crc kubenswrapper[4775]: I1216 15:45:00.155763 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd"] Dec 16 15:45:00 crc kubenswrapper[4775]: I1216 15:45:00.159096 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" Dec 16 15:45:00 crc kubenswrapper[4775]: I1216 15:45:00.162169 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 15:45:00 crc kubenswrapper[4775]: I1216 15:45:00.162365 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 15:45:00 crc kubenswrapper[4775]: I1216 15:45:00.165866 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd"] Dec 16 15:45:00 crc kubenswrapper[4775]: I1216 15:45:00.219097 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6phn\" (UniqueName: \"kubernetes.io/projected/c9fce0cc-fd15-485f-9b1b-af84f534d797-kube-api-access-s6phn\") pod \"collect-profiles-29431665-p92bd\" (UID: \"c9fce0cc-fd15-485f-9b1b-af84f534d797\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" Dec 16 15:45:00 crc kubenswrapper[4775]: I1216 15:45:00.219408 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9fce0cc-fd15-485f-9b1b-af84f534d797-config-volume\") pod \"collect-profiles-29431665-p92bd\" (UID: \"c9fce0cc-fd15-485f-9b1b-af84f534d797\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" Dec 16 15:45:00 crc 
kubenswrapper[4775]: I1216 15:45:00.219481 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9fce0cc-fd15-485f-9b1b-af84f534d797-secret-volume\") pod \"collect-profiles-29431665-p92bd\" (UID: \"c9fce0cc-fd15-485f-9b1b-af84f534d797\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" Dec 16 15:45:00 crc kubenswrapper[4775]: I1216 15:45:00.321915 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6phn\" (UniqueName: \"kubernetes.io/projected/c9fce0cc-fd15-485f-9b1b-af84f534d797-kube-api-access-s6phn\") pod \"collect-profiles-29431665-p92bd\" (UID: \"c9fce0cc-fd15-485f-9b1b-af84f534d797\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" Dec 16 15:45:00 crc kubenswrapper[4775]: I1216 15:45:00.322035 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9fce0cc-fd15-485f-9b1b-af84f534d797-config-volume\") pod \"collect-profiles-29431665-p92bd\" (UID: \"c9fce0cc-fd15-485f-9b1b-af84f534d797\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" Dec 16 15:45:00 crc kubenswrapper[4775]: I1216 15:45:00.322067 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9fce0cc-fd15-485f-9b1b-af84f534d797-secret-volume\") pod \"collect-profiles-29431665-p92bd\" (UID: \"c9fce0cc-fd15-485f-9b1b-af84f534d797\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" Dec 16 15:45:00 crc kubenswrapper[4775]: I1216 15:45:00.323104 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9fce0cc-fd15-485f-9b1b-af84f534d797-config-volume\") pod \"collect-profiles-29431665-p92bd\" (UID: 
\"c9fce0cc-fd15-485f-9b1b-af84f534d797\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" Dec 16 15:45:00 crc kubenswrapper[4775]: I1216 15:45:00.328073 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9fce0cc-fd15-485f-9b1b-af84f534d797-secret-volume\") pod \"collect-profiles-29431665-p92bd\" (UID: \"c9fce0cc-fd15-485f-9b1b-af84f534d797\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" Dec 16 15:45:00 crc kubenswrapper[4775]: I1216 15:45:00.372796 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6phn\" (UniqueName: \"kubernetes.io/projected/c9fce0cc-fd15-485f-9b1b-af84f534d797-kube-api-access-s6phn\") pod \"collect-profiles-29431665-p92bd\" (UID: \"c9fce0cc-fd15-485f-9b1b-af84f534d797\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" Dec 16 15:45:00 crc kubenswrapper[4775]: I1216 15:45:00.482617 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" Dec 16 15:45:00 crc kubenswrapper[4775]: I1216 15:45:00.975269 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd"] Dec 16 15:45:01 crc kubenswrapper[4775]: I1216 15:45:01.318400 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" event={"ID":"c9fce0cc-fd15-485f-9b1b-af84f534d797","Type":"ContainerStarted","Data":"60c8ab2faafd96c9d67d847fde8516de07e10767ec09d67d615445deb706985d"} Dec 16 15:45:01 crc kubenswrapper[4775]: I1216 15:45:01.318734 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" event={"ID":"c9fce0cc-fd15-485f-9b1b-af84f534d797","Type":"ContainerStarted","Data":"68e3880f668faaa2356f3f36f5eebdf37b27823585f52770e6603b35bbf8a684"} Dec 16 15:45:01 crc kubenswrapper[4775]: I1216 15:45:01.343171 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" podStartSLOduration=1.3431546349999999 podStartE2EDuration="1.343154635s" podCreationTimestamp="2025-12-16 15:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:45:01.342390481 +0000 UTC m=+3026.293469404" watchObservedRunningTime="2025-12-16 15:45:01.343154635 +0000 UTC m=+3026.294233558" Dec 16 15:45:02 crc kubenswrapper[4775]: I1216 15:45:02.330143 4775 generic.go:334] "Generic (PLEG): container finished" podID="c9fce0cc-fd15-485f-9b1b-af84f534d797" containerID="60c8ab2faafd96c9d67d847fde8516de07e10767ec09d67d615445deb706985d" exitCode=0 Dec 16 15:45:02 crc kubenswrapper[4775]: I1216 15:45:02.330236 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" event={"ID":"c9fce0cc-fd15-485f-9b1b-af84f534d797","Type":"ContainerDied","Data":"60c8ab2faafd96c9d67d847fde8516de07e10767ec09d67d615445deb706985d"} Dec 16 15:45:03 crc kubenswrapper[4775]: I1216 15:45:03.843075 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" Dec 16 15:45:03 crc kubenswrapper[4775]: I1216 15:45:03.988215 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9fce0cc-fd15-485f-9b1b-af84f534d797-secret-volume\") pod \"c9fce0cc-fd15-485f-9b1b-af84f534d797\" (UID: \"c9fce0cc-fd15-485f-9b1b-af84f534d797\") " Dec 16 15:45:03 crc kubenswrapper[4775]: I1216 15:45:03.988476 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9fce0cc-fd15-485f-9b1b-af84f534d797-config-volume\") pod \"c9fce0cc-fd15-485f-9b1b-af84f534d797\" (UID: \"c9fce0cc-fd15-485f-9b1b-af84f534d797\") " Dec 16 15:45:03 crc kubenswrapper[4775]: I1216 15:45:03.988524 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6phn\" (UniqueName: \"kubernetes.io/projected/c9fce0cc-fd15-485f-9b1b-af84f534d797-kube-api-access-s6phn\") pod \"c9fce0cc-fd15-485f-9b1b-af84f534d797\" (UID: \"c9fce0cc-fd15-485f-9b1b-af84f534d797\") " Dec 16 15:45:03 crc kubenswrapper[4775]: I1216 15:45:03.989251 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9fce0cc-fd15-485f-9b1b-af84f534d797-config-volume" (OuterVolumeSpecName: "config-volume") pod "c9fce0cc-fd15-485f-9b1b-af84f534d797" (UID: "c9fce0cc-fd15-485f-9b1b-af84f534d797"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:45:03 crc kubenswrapper[4775]: I1216 15:45:03.996792 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9fce0cc-fd15-485f-9b1b-af84f534d797-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c9fce0cc-fd15-485f-9b1b-af84f534d797" (UID: "c9fce0cc-fd15-485f-9b1b-af84f534d797"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:45:03 crc kubenswrapper[4775]: I1216 15:45:03.996807 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9fce0cc-fd15-485f-9b1b-af84f534d797-kube-api-access-s6phn" (OuterVolumeSpecName: "kube-api-access-s6phn") pod "c9fce0cc-fd15-485f-9b1b-af84f534d797" (UID: "c9fce0cc-fd15-485f-9b1b-af84f534d797"). InnerVolumeSpecName "kube-api-access-s6phn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:45:04 crc kubenswrapper[4775]: I1216 15:45:04.091240 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9fce0cc-fd15-485f-9b1b-af84f534d797-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 15:45:04 crc kubenswrapper[4775]: I1216 15:45:04.091289 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9fce0cc-fd15-485f-9b1b-af84f534d797-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 15:45:04 crc kubenswrapper[4775]: I1216 15:45:04.091306 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6phn\" (UniqueName: \"kubernetes.io/projected/c9fce0cc-fd15-485f-9b1b-af84f534d797-kube-api-access-s6phn\") on node \"crc\" DevicePath \"\"" Dec 16 15:45:04 crc kubenswrapper[4775]: I1216 15:45:04.358013 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" 
event={"ID":"c9fce0cc-fd15-485f-9b1b-af84f534d797","Type":"ContainerDied","Data":"68e3880f668faaa2356f3f36f5eebdf37b27823585f52770e6603b35bbf8a684"} Dec 16 15:45:04 crc kubenswrapper[4775]: I1216 15:45:04.358057 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68e3880f668faaa2356f3f36f5eebdf37b27823585f52770e6603b35bbf8a684" Dec 16 15:45:04 crc kubenswrapper[4775]: I1216 15:45:04.358102 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-p92bd" Dec 16 15:45:04 crc kubenswrapper[4775]: I1216 15:45:04.416479 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh"] Dec 16 15:45:04 crc kubenswrapper[4775]: I1216 15:45:04.424029 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431620-jx5mh"] Dec 16 15:45:05 crc kubenswrapper[4775]: I1216 15:45:05.347413 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="664d0b17-7c71-45b4-b654-b478ba3737e8" path="/var/lib/kubelet/pods/664d0b17-7c71-45b4-b654-b478ba3737e8/volumes" Dec 16 15:45:32 crc kubenswrapper[4775]: I1216 15:45:32.869451 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:45:32 crc kubenswrapper[4775]: I1216 15:45:32.870293 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:45:58 crc 
kubenswrapper[4775]: I1216 15:45:58.266075 4775 scope.go:117] "RemoveContainer" containerID="e2da98f295d602378ff1162446963de4a3e2fe1ce51c76a9a1f6c5c9d8b1a3a2" Dec 16 15:46:02 crc kubenswrapper[4775]: I1216 15:46:02.869425 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:46:02 crc kubenswrapper[4775]: I1216 15:46:02.870073 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:46:32 crc kubenswrapper[4775]: I1216 15:46:32.869430 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:46:32 crc kubenswrapper[4775]: I1216 15:46:32.869973 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:46:32 crc kubenswrapper[4775]: I1216 15:46:32.870209 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 15:46:32 crc kubenswrapper[4775]: I1216 15:46:32.870766 4775 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6"} pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:46:32 crc kubenswrapper[4775]: I1216 15:46:32.870807 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" containerID="cri-o://7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" gracePeriod=600 Dec 16 15:46:33 crc kubenswrapper[4775]: E1216 15:46:33.006997 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:46:33 crc kubenswrapper[4775]: I1216 15:46:33.134466 4775 generic.go:334] "Generic (PLEG): container finished" podID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" exitCode=0 Dec 16 15:46:33 crc kubenswrapper[4775]: I1216 15:46:33.134540 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerDied","Data":"7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6"} Dec 16 15:46:33 crc kubenswrapper[4775]: I1216 15:46:33.134607 4775 scope.go:117] "RemoveContainer" containerID="4729b3904585254169a82d5de79bd940893d6ca623bcb8a4ed43c5b86e8831f4" Dec 16 15:46:33 crc 
kubenswrapper[4775]: I1216 15:46:33.135478 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:46:33 crc kubenswrapper[4775]: E1216 15:46:33.136023 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:46:46 crc kubenswrapper[4775]: I1216 15:46:46.337745 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:46:46 crc kubenswrapper[4775]: E1216 15:46:46.338476 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:46:59 crc kubenswrapper[4775]: I1216 15:46:59.340633 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:46:59 crc kubenswrapper[4775]: E1216 15:46:59.341279 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 
16 15:47:14 crc kubenswrapper[4775]: I1216 15:47:14.342251 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:47:14 crc kubenswrapper[4775]: E1216 15:47:14.346323 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:47:26 crc kubenswrapper[4775]: I1216 15:47:26.337669 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:47:26 crc kubenswrapper[4775]: E1216 15:47:26.338418 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:47:40 crc kubenswrapper[4775]: I1216 15:47:40.338223 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:47:40 crc kubenswrapper[4775]: E1216 15:47:40.339169 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" 
podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:47:54 crc kubenswrapper[4775]: I1216 15:47:54.338660 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:47:54 crc kubenswrapper[4775]: E1216 15:47:54.339748 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:48:05 crc kubenswrapper[4775]: I1216 15:48:05.344279 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:48:05 crc kubenswrapper[4775]: E1216 15:48:05.346117 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:48:19 crc kubenswrapper[4775]: I1216 15:48:19.338845 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:48:19 crc kubenswrapper[4775]: E1216 15:48:19.339512 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:48:31 crc kubenswrapper[4775]: I1216 15:48:31.337913 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:48:31 crc kubenswrapper[4775]: E1216 15:48:31.338540 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:48:46 crc kubenswrapper[4775]: I1216 15:48:46.338526 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:48:46 crc kubenswrapper[4775]: E1216 15:48:46.339275 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:48:57 crc kubenswrapper[4775]: I1216 15:48:57.337873 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:48:57 crc kubenswrapper[4775]: E1216 15:48:57.338776 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:49:09 crc kubenswrapper[4775]: I1216 15:49:09.337828 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:49:09 crc kubenswrapper[4775]: E1216 15:49:09.338675 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:49:21 crc kubenswrapper[4775]: I1216 15:49:21.338506 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:49:21 crc kubenswrapper[4775]: E1216 15:49:21.339498 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:49:32 crc kubenswrapper[4775]: I1216 15:49:32.337757 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:49:32 crc kubenswrapper[4775]: E1216 15:49:32.338556 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:49:36 crc kubenswrapper[4775]: I1216 15:49:36.365289 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fzbmq"] Dec 16 15:49:36 crc kubenswrapper[4775]: E1216 15:49:36.369877 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9fce0cc-fd15-485f-9b1b-af84f534d797" containerName="collect-profiles" Dec 16 15:49:36 crc kubenswrapper[4775]: I1216 15:49:36.369930 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9fce0cc-fd15-485f-9b1b-af84f534d797" containerName="collect-profiles" Dec 16 15:49:36 crc kubenswrapper[4775]: I1216 15:49:36.370686 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9fce0cc-fd15-485f-9b1b-af84f534d797" containerName="collect-profiles" Dec 16 15:49:36 crc kubenswrapper[4775]: I1216 15:49:36.376372 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fzbmq" Dec 16 15:49:36 crc kubenswrapper[4775]: I1216 15:49:36.396309 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fzbmq"] Dec 16 15:49:36 crc kubenswrapper[4775]: I1216 15:49:36.566175 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f738941-f3db-4e67-a1ee-de287c12634a-catalog-content\") pod \"redhat-operators-fzbmq\" (UID: \"4f738941-f3db-4e67-a1ee-de287c12634a\") " pod="openshift-marketplace/redhat-operators-fzbmq" Dec 16 15:49:36 crc kubenswrapper[4775]: I1216 15:49:36.566484 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7swc\" (UniqueName: \"kubernetes.io/projected/4f738941-f3db-4e67-a1ee-de287c12634a-kube-api-access-j7swc\") pod \"redhat-operators-fzbmq\" (UID: \"4f738941-f3db-4e67-a1ee-de287c12634a\") " pod="openshift-marketplace/redhat-operators-fzbmq" Dec 16 15:49:36 crc kubenswrapper[4775]: I1216 15:49:36.566634 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f738941-f3db-4e67-a1ee-de287c12634a-utilities\") pod \"redhat-operators-fzbmq\" (UID: \"4f738941-f3db-4e67-a1ee-de287c12634a\") " pod="openshift-marketplace/redhat-operators-fzbmq" Dec 16 15:49:36 crc kubenswrapper[4775]: I1216 15:49:36.668489 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f738941-f3db-4e67-a1ee-de287c12634a-catalog-content\") pod \"redhat-operators-fzbmq\" (UID: \"4f738941-f3db-4e67-a1ee-de287c12634a\") " pod="openshift-marketplace/redhat-operators-fzbmq" Dec 16 15:49:36 crc kubenswrapper[4775]: I1216 15:49:36.668537 4775 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-j7swc\" (UniqueName: \"kubernetes.io/projected/4f738941-f3db-4e67-a1ee-de287c12634a-kube-api-access-j7swc\") pod \"redhat-operators-fzbmq\" (UID: \"4f738941-f3db-4e67-a1ee-de287c12634a\") " pod="openshift-marketplace/redhat-operators-fzbmq" Dec 16 15:49:36 crc kubenswrapper[4775]: I1216 15:49:36.668605 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f738941-f3db-4e67-a1ee-de287c12634a-utilities\") pod \"redhat-operators-fzbmq\" (UID: \"4f738941-f3db-4e67-a1ee-de287c12634a\") " pod="openshift-marketplace/redhat-operators-fzbmq" Dec 16 15:49:36 crc kubenswrapper[4775]: I1216 15:49:36.669068 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f738941-f3db-4e67-a1ee-de287c12634a-catalog-content\") pod \"redhat-operators-fzbmq\" (UID: \"4f738941-f3db-4e67-a1ee-de287c12634a\") " pod="openshift-marketplace/redhat-operators-fzbmq" Dec 16 15:49:36 crc kubenswrapper[4775]: I1216 15:49:36.669204 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f738941-f3db-4e67-a1ee-de287c12634a-utilities\") pod \"redhat-operators-fzbmq\" (UID: \"4f738941-f3db-4e67-a1ee-de287c12634a\") " pod="openshift-marketplace/redhat-operators-fzbmq" Dec 16 15:49:36 crc kubenswrapper[4775]: I1216 15:49:36.696325 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7swc\" (UniqueName: \"kubernetes.io/projected/4f738941-f3db-4e67-a1ee-de287c12634a-kube-api-access-j7swc\") pod \"redhat-operators-fzbmq\" (UID: \"4f738941-f3db-4e67-a1ee-de287c12634a\") " pod="openshift-marketplace/redhat-operators-fzbmq" Dec 16 15:49:36 crc kubenswrapper[4775]: I1216 15:49:36.703379 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fzbmq" Dec 16 15:49:37 crc kubenswrapper[4775]: I1216 15:49:37.165428 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fzbmq"] Dec 16 15:49:37 crc kubenswrapper[4775]: I1216 15:49:37.769970 4775 generic.go:334] "Generic (PLEG): container finished" podID="4f738941-f3db-4e67-a1ee-de287c12634a" containerID="298679bbd4a89ca4853b48dc7f810f8cfd5dfb0511e71ecadee6d826d1df93b0" exitCode=0 Dec 16 15:49:37 crc kubenswrapper[4775]: I1216 15:49:37.770019 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzbmq" event={"ID":"4f738941-f3db-4e67-a1ee-de287c12634a","Type":"ContainerDied","Data":"298679bbd4a89ca4853b48dc7f810f8cfd5dfb0511e71ecadee6d826d1df93b0"} Dec 16 15:49:37 crc kubenswrapper[4775]: I1216 15:49:37.770332 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzbmq" event={"ID":"4f738941-f3db-4e67-a1ee-de287c12634a","Type":"ContainerStarted","Data":"449ca3457532d02370c9d76bc1d7fe07106f550f797b71978a5de893b0564808"} Dec 16 15:49:37 crc kubenswrapper[4775]: I1216 15:49:37.772408 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 15:49:39 crc kubenswrapper[4775]: I1216 15:49:39.790551 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzbmq" event={"ID":"4f738941-f3db-4e67-a1ee-de287c12634a","Type":"ContainerStarted","Data":"fa45d09d4a312c6b7ed547147f3930547cba6d9ffebe44698eb19368d631f7ef"} Dec 16 15:49:42 crc kubenswrapper[4775]: I1216 15:49:42.817570 4775 generic.go:334] "Generic (PLEG): container finished" podID="4f738941-f3db-4e67-a1ee-de287c12634a" containerID="fa45d09d4a312c6b7ed547147f3930547cba6d9ffebe44698eb19368d631f7ef" exitCode=0 Dec 16 15:49:42 crc kubenswrapper[4775]: I1216 15:49:42.818044 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-fzbmq" event={"ID":"4f738941-f3db-4e67-a1ee-de287c12634a","Type":"ContainerDied","Data":"fa45d09d4a312c6b7ed547147f3930547cba6d9ffebe44698eb19368d631f7ef"} Dec 16 15:49:43 crc kubenswrapper[4775]: I1216 15:49:43.832279 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzbmq" event={"ID":"4f738941-f3db-4e67-a1ee-de287c12634a","Type":"ContainerStarted","Data":"1f8774aeeb1bcf6587dfcc0518d65d49005d666c69bb3f282a410236b37852b6"} Dec 16 15:49:43 crc kubenswrapper[4775]: I1216 15:49:43.858390 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fzbmq" podStartSLOduration=2.339326602 podStartE2EDuration="7.858372408s" podCreationTimestamp="2025-12-16 15:49:36 +0000 UTC" firstStartedPulling="2025-12-16 15:49:37.772176518 +0000 UTC m=+3302.723255441" lastFinishedPulling="2025-12-16 15:49:43.291222324 +0000 UTC m=+3308.242301247" observedRunningTime="2025-12-16 15:49:43.848844662 +0000 UTC m=+3308.799923605" watchObservedRunningTime="2025-12-16 15:49:43.858372408 +0000 UTC m=+3308.809451331" Dec 16 15:49:46 crc kubenswrapper[4775]: I1216 15:49:46.338088 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:49:46 crc kubenswrapper[4775]: E1216 15:49:46.338622 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:49:46 crc kubenswrapper[4775]: I1216 15:49:46.704064 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-fzbmq" Dec 16 15:49:46 crc kubenswrapper[4775]: I1216 15:49:46.704112 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fzbmq" Dec 16 15:49:47 crc kubenswrapper[4775]: I1216 15:49:47.756498 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fzbmq" podUID="4f738941-f3db-4e67-a1ee-de287c12634a" containerName="registry-server" probeResult="failure" output=< Dec 16 15:49:47 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Dec 16 15:49:47 crc kubenswrapper[4775]: > Dec 16 15:49:56 crc kubenswrapper[4775]: I1216 15:49:56.759110 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fzbmq" Dec 16 15:49:56 crc kubenswrapper[4775]: I1216 15:49:56.818416 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fzbmq" Dec 16 15:49:56 crc kubenswrapper[4775]: I1216 15:49:56.995663 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fzbmq"] Dec 16 15:49:57 crc kubenswrapper[4775]: I1216 15:49:57.338321 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:49:57 crc kubenswrapper[4775]: E1216 15:49:57.338561 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:49:57 crc kubenswrapper[4775]: I1216 15:49:57.949346 4775 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-operators-fzbmq" podUID="4f738941-f3db-4e67-a1ee-de287c12634a" containerName="registry-server" containerID="cri-o://1f8774aeeb1bcf6587dfcc0518d65d49005d666c69bb3f282a410236b37852b6" gracePeriod=2 Dec 16 15:49:58 crc kubenswrapper[4775]: I1216 15:49:58.483037 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fzbmq" Dec 16 15:49:58 crc kubenswrapper[4775]: I1216 15:49:58.576565 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7swc\" (UniqueName: \"kubernetes.io/projected/4f738941-f3db-4e67-a1ee-de287c12634a-kube-api-access-j7swc\") pod \"4f738941-f3db-4e67-a1ee-de287c12634a\" (UID: \"4f738941-f3db-4e67-a1ee-de287c12634a\") " Dec 16 15:49:58 crc kubenswrapper[4775]: I1216 15:49:58.576639 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f738941-f3db-4e67-a1ee-de287c12634a-utilities\") pod \"4f738941-f3db-4e67-a1ee-de287c12634a\" (UID: \"4f738941-f3db-4e67-a1ee-de287c12634a\") " Dec 16 15:49:58 crc kubenswrapper[4775]: I1216 15:49:58.576666 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f738941-f3db-4e67-a1ee-de287c12634a-catalog-content\") pod \"4f738941-f3db-4e67-a1ee-de287c12634a\" (UID: \"4f738941-f3db-4e67-a1ee-de287c12634a\") " Dec 16 15:49:58 crc kubenswrapper[4775]: I1216 15:49:58.577590 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f738941-f3db-4e67-a1ee-de287c12634a-utilities" (OuterVolumeSpecName: "utilities") pod "4f738941-f3db-4e67-a1ee-de287c12634a" (UID: "4f738941-f3db-4e67-a1ee-de287c12634a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:49:58 crc kubenswrapper[4775]: I1216 15:49:58.589964 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f738941-f3db-4e67-a1ee-de287c12634a-kube-api-access-j7swc" (OuterVolumeSpecName: "kube-api-access-j7swc") pod "4f738941-f3db-4e67-a1ee-de287c12634a" (UID: "4f738941-f3db-4e67-a1ee-de287c12634a"). InnerVolumeSpecName "kube-api-access-j7swc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:49:58 crc kubenswrapper[4775]: I1216 15:49:58.678948 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7swc\" (UniqueName: \"kubernetes.io/projected/4f738941-f3db-4e67-a1ee-de287c12634a-kube-api-access-j7swc\") on node \"crc\" DevicePath \"\"" Dec 16 15:49:58 crc kubenswrapper[4775]: I1216 15:49:58.678984 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f738941-f3db-4e67-a1ee-de287c12634a-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:49:58 crc kubenswrapper[4775]: I1216 15:49:58.688666 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f738941-f3db-4e67-a1ee-de287c12634a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f738941-f3db-4e67-a1ee-de287c12634a" (UID: "4f738941-f3db-4e67-a1ee-de287c12634a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:49:58 crc kubenswrapper[4775]: I1216 15:49:58.780534 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f738941-f3db-4e67-a1ee-de287c12634a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:49:58 crc kubenswrapper[4775]: I1216 15:49:58.961657 4775 generic.go:334] "Generic (PLEG): container finished" podID="4f738941-f3db-4e67-a1ee-de287c12634a" containerID="1f8774aeeb1bcf6587dfcc0518d65d49005d666c69bb3f282a410236b37852b6" exitCode=0 Dec 16 15:49:58 crc kubenswrapper[4775]: I1216 15:49:58.961707 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fzbmq" Dec 16 15:49:58 crc kubenswrapper[4775]: I1216 15:49:58.961724 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzbmq" event={"ID":"4f738941-f3db-4e67-a1ee-de287c12634a","Type":"ContainerDied","Data":"1f8774aeeb1bcf6587dfcc0518d65d49005d666c69bb3f282a410236b37852b6"} Dec 16 15:49:58 crc kubenswrapper[4775]: I1216 15:49:58.963662 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzbmq" event={"ID":"4f738941-f3db-4e67-a1ee-de287c12634a","Type":"ContainerDied","Data":"449ca3457532d02370c9d76bc1d7fe07106f550f797b71978a5de893b0564808"} Dec 16 15:49:58 crc kubenswrapper[4775]: I1216 15:49:58.963701 4775 scope.go:117] "RemoveContainer" containerID="1f8774aeeb1bcf6587dfcc0518d65d49005d666c69bb3f282a410236b37852b6" Dec 16 15:49:59 crc kubenswrapper[4775]: I1216 15:49:59.018186 4775 scope.go:117] "RemoveContainer" containerID="fa45d09d4a312c6b7ed547147f3930547cba6d9ffebe44698eb19368d631f7ef" Dec 16 15:49:59 crc kubenswrapper[4775]: I1216 15:49:59.025797 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fzbmq"] Dec 16 15:49:59 crc kubenswrapper[4775]: I1216 
15:49:59.035620 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fzbmq"] Dec 16 15:49:59 crc kubenswrapper[4775]: I1216 15:49:59.050044 4775 scope.go:117] "RemoveContainer" containerID="298679bbd4a89ca4853b48dc7f810f8cfd5dfb0511e71ecadee6d826d1df93b0" Dec 16 15:49:59 crc kubenswrapper[4775]: I1216 15:49:59.102468 4775 scope.go:117] "RemoveContainer" containerID="1f8774aeeb1bcf6587dfcc0518d65d49005d666c69bb3f282a410236b37852b6" Dec 16 15:49:59 crc kubenswrapper[4775]: E1216 15:49:59.102840 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f8774aeeb1bcf6587dfcc0518d65d49005d666c69bb3f282a410236b37852b6\": container with ID starting with 1f8774aeeb1bcf6587dfcc0518d65d49005d666c69bb3f282a410236b37852b6 not found: ID does not exist" containerID="1f8774aeeb1bcf6587dfcc0518d65d49005d666c69bb3f282a410236b37852b6" Dec 16 15:49:59 crc kubenswrapper[4775]: I1216 15:49:59.102865 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f8774aeeb1bcf6587dfcc0518d65d49005d666c69bb3f282a410236b37852b6"} err="failed to get container status \"1f8774aeeb1bcf6587dfcc0518d65d49005d666c69bb3f282a410236b37852b6\": rpc error: code = NotFound desc = could not find container \"1f8774aeeb1bcf6587dfcc0518d65d49005d666c69bb3f282a410236b37852b6\": container with ID starting with 1f8774aeeb1bcf6587dfcc0518d65d49005d666c69bb3f282a410236b37852b6 not found: ID does not exist" Dec 16 15:49:59 crc kubenswrapper[4775]: I1216 15:49:59.102897 4775 scope.go:117] "RemoveContainer" containerID="fa45d09d4a312c6b7ed547147f3930547cba6d9ffebe44698eb19368d631f7ef" Dec 16 15:49:59 crc kubenswrapper[4775]: E1216 15:49:59.104283 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa45d09d4a312c6b7ed547147f3930547cba6d9ffebe44698eb19368d631f7ef\": container with ID 
starting with fa45d09d4a312c6b7ed547147f3930547cba6d9ffebe44698eb19368d631f7ef not found: ID does not exist" containerID="fa45d09d4a312c6b7ed547147f3930547cba6d9ffebe44698eb19368d631f7ef" Dec 16 15:49:59 crc kubenswrapper[4775]: I1216 15:49:59.104329 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa45d09d4a312c6b7ed547147f3930547cba6d9ffebe44698eb19368d631f7ef"} err="failed to get container status \"fa45d09d4a312c6b7ed547147f3930547cba6d9ffebe44698eb19368d631f7ef\": rpc error: code = NotFound desc = could not find container \"fa45d09d4a312c6b7ed547147f3930547cba6d9ffebe44698eb19368d631f7ef\": container with ID starting with fa45d09d4a312c6b7ed547147f3930547cba6d9ffebe44698eb19368d631f7ef not found: ID does not exist" Dec 16 15:49:59 crc kubenswrapper[4775]: I1216 15:49:59.104354 4775 scope.go:117] "RemoveContainer" containerID="298679bbd4a89ca4853b48dc7f810f8cfd5dfb0511e71ecadee6d826d1df93b0" Dec 16 15:49:59 crc kubenswrapper[4775]: E1216 15:49:59.104717 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"298679bbd4a89ca4853b48dc7f810f8cfd5dfb0511e71ecadee6d826d1df93b0\": container with ID starting with 298679bbd4a89ca4853b48dc7f810f8cfd5dfb0511e71ecadee6d826d1df93b0 not found: ID does not exist" containerID="298679bbd4a89ca4853b48dc7f810f8cfd5dfb0511e71ecadee6d826d1df93b0" Dec 16 15:49:59 crc kubenswrapper[4775]: I1216 15:49:59.104746 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"298679bbd4a89ca4853b48dc7f810f8cfd5dfb0511e71ecadee6d826d1df93b0"} err="failed to get container status \"298679bbd4a89ca4853b48dc7f810f8cfd5dfb0511e71ecadee6d826d1df93b0\": rpc error: code = NotFound desc = could not find container \"298679bbd4a89ca4853b48dc7f810f8cfd5dfb0511e71ecadee6d826d1df93b0\": container with ID starting with 298679bbd4a89ca4853b48dc7f810f8cfd5dfb0511e71ecadee6d826d1df93b0 not found: 
ID does not exist" Dec 16 15:49:59 crc kubenswrapper[4775]: I1216 15:49:59.347707 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f738941-f3db-4e67-a1ee-de287c12634a" path="/var/lib/kubelet/pods/4f738941-f3db-4e67-a1ee-de287c12634a/volumes" Dec 16 15:50:12 crc kubenswrapper[4775]: I1216 15:50:12.337471 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:50:12 crc kubenswrapper[4775]: E1216 15:50:12.338356 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:50:25 crc kubenswrapper[4775]: I1216 15:50:25.345600 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:50:25 crc kubenswrapper[4775]: E1216 15:50:25.346404 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:50:36 crc kubenswrapper[4775]: I1216 15:50:36.339395 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:50:36 crc kubenswrapper[4775]: E1216 15:50:36.340727 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:50:49 crc kubenswrapper[4775]: I1216 15:50:49.339058 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:50:49 crc kubenswrapper[4775]: E1216 15:50:49.339827 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:51:00 crc kubenswrapper[4775]: I1216 15:51:00.339036 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:51:00 crc kubenswrapper[4775]: E1216 15:51:00.339830 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:51:14 crc kubenswrapper[4775]: I1216 15:51:14.338342 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:51:14 crc kubenswrapper[4775]: E1216 15:51:14.339191 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:51:26 crc kubenswrapper[4775]: I1216 15:51:26.338497 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:51:26 crc kubenswrapper[4775]: E1216 15:51:26.339753 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:51:41 crc kubenswrapper[4775]: I1216 15:51:41.338097 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:51:41 crc kubenswrapper[4775]: I1216 15:51:41.908422 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"e1c8d61f3889d5bd1528924be8d5a0555fad8c946aa20ed5020d09539894e766"} Dec 16 15:51:47 crc kubenswrapper[4775]: I1216 15:51:47.868959 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zj99q"] Dec 16 15:51:47 crc kubenswrapper[4775]: E1216 15:51:47.869848 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f738941-f3db-4e67-a1ee-de287c12634a" containerName="extract-utilities" Dec 16 15:51:47 crc kubenswrapper[4775]: I1216 15:51:47.869859 4775 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4f738941-f3db-4e67-a1ee-de287c12634a" containerName="extract-utilities" Dec 16 15:51:47 crc kubenswrapper[4775]: E1216 15:51:47.869876 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f738941-f3db-4e67-a1ee-de287c12634a" containerName="registry-server" Dec 16 15:51:47 crc kubenswrapper[4775]: I1216 15:51:47.869882 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f738941-f3db-4e67-a1ee-de287c12634a" containerName="registry-server" Dec 16 15:51:47 crc kubenswrapper[4775]: E1216 15:51:47.869921 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f738941-f3db-4e67-a1ee-de287c12634a" containerName="extract-content" Dec 16 15:51:47 crc kubenswrapper[4775]: I1216 15:51:47.869927 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f738941-f3db-4e67-a1ee-de287c12634a" containerName="extract-content" Dec 16 15:51:47 crc kubenswrapper[4775]: I1216 15:51:47.870113 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f738941-f3db-4e67-a1ee-de287c12634a" containerName="registry-server" Dec 16 15:51:47 crc kubenswrapper[4775]: I1216 15:51:47.871552 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zj99q" Dec 16 15:51:47 crc kubenswrapper[4775]: I1216 15:51:47.891641 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zj99q"] Dec 16 15:51:47 crc kubenswrapper[4775]: I1216 15:51:47.903316 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d71a09ff-b336-4c9f-9950-4895dfd2192e-catalog-content\") pod \"certified-operators-zj99q\" (UID: \"d71a09ff-b336-4c9f-9950-4895dfd2192e\") " pod="openshift-marketplace/certified-operators-zj99q" Dec 16 15:51:47 crc kubenswrapper[4775]: I1216 15:51:47.903381 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d71a09ff-b336-4c9f-9950-4895dfd2192e-utilities\") pod \"certified-operators-zj99q\" (UID: \"d71a09ff-b336-4c9f-9950-4895dfd2192e\") " pod="openshift-marketplace/certified-operators-zj99q" Dec 16 15:51:47 crc kubenswrapper[4775]: I1216 15:51:47.903422 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txz8l\" (UniqueName: \"kubernetes.io/projected/d71a09ff-b336-4c9f-9950-4895dfd2192e-kube-api-access-txz8l\") pod \"certified-operators-zj99q\" (UID: \"d71a09ff-b336-4c9f-9950-4895dfd2192e\") " pod="openshift-marketplace/certified-operators-zj99q" Dec 16 15:51:48 crc kubenswrapper[4775]: I1216 15:51:48.004420 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d71a09ff-b336-4c9f-9950-4895dfd2192e-catalog-content\") pod \"certified-operators-zj99q\" (UID: \"d71a09ff-b336-4c9f-9950-4895dfd2192e\") " pod="openshift-marketplace/certified-operators-zj99q" Dec 16 15:51:48 crc kubenswrapper[4775]: I1216 15:51:48.004475 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d71a09ff-b336-4c9f-9950-4895dfd2192e-utilities\") pod \"certified-operators-zj99q\" (UID: \"d71a09ff-b336-4c9f-9950-4895dfd2192e\") " pod="openshift-marketplace/certified-operators-zj99q" Dec 16 15:51:48 crc kubenswrapper[4775]: I1216 15:51:48.004506 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txz8l\" (UniqueName: \"kubernetes.io/projected/d71a09ff-b336-4c9f-9950-4895dfd2192e-kube-api-access-txz8l\") pod \"certified-operators-zj99q\" (UID: \"d71a09ff-b336-4c9f-9950-4895dfd2192e\") " pod="openshift-marketplace/certified-operators-zj99q" Dec 16 15:51:48 crc kubenswrapper[4775]: I1216 15:51:48.005168 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d71a09ff-b336-4c9f-9950-4895dfd2192e-catalog-content\") pod \"certified-operators-zj99q\" (UID: \"d71a09ff-b336-4c9f-9950-4895dfd2192e\") " pod="openshift-marketplace/certified-operators-zj99q" Dec 16 15:51:48 crc kubenswrapper[4775]: I1216 15:51:48.005202 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d71a09ff-b336-4c9f-9950-4895dfd2192e-utilities\") pod \"certified-operators-zj99q\" (UID: \"d71a09ff-b336-4c9f-9950-4895dfd2192e\") " pod="openshift-marketplace/certified-operators-zj99q" Dec 16 15:51:48 crc kubenswrapper[4775]: I1216 15:51:48.026394 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txz8l\" (UniqueName: \"kubernetes.io/projected/d71a09ff-b336-4c9f-9950-4895dfd2192e-kube-api-access-txz8l\") pod \"certified-operators-zj99q\" (UID: \"d71a09ff-b336-4c9f-9950-4895dfd2192e\") " pod="openshift-marketplace/certified-operators-zj99q" Dec 16 15:51:48 crc kubenswrapper[4775]: I1216 15:51:48.192122 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zj99q" Dec 16 15:51:48 crc kubenswrapper[4775]: I1216 15:51:48.743603 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zj99q"] Dec 16 15:51:48 crc kubenswrapper[4775]: I1216 15:51:48.978704 4775 generic.go:334] "Generic (PLEG): container finished" podID="d71a09ff-b336-4c9f-9950-4895dfd2192e" containerID="996b87273282136e7ac3ec6c670982dba9ddc2907363628c21085bbf63b96eb0" exitCode=0 Dec 16 15:51:48 crc kubenswrapper[4775]: I1216 15:51:48.978770 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj99q" event={"ID":"d71a09ff-b336-4c9f-9950-4895dfd2192e","Type":"ContainerDied","Data":"996b87273282136e7ac3ec6c670982dba9ddc2907363628c21085bbf63b96eb0"} Dec 16 15:51:48 crc kubenswrapper[4775]: I1216 15:51:48.978836 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj99q" event={"ID":"d71a09ff-b336-4c9f-9950-4895dfd2192e","Type":"ContainerStarted","Data":"6acab732bf2092b1c68b854ac75ca900db9a60d19d87c868b2548882885e0862"} Dec 16 15:51:49 crc kubenswrapper[4775]: I1216 15:51:49.988382 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj99q" event={"ID":"d71a09ff-b336-4c9f-9950-4895dfd2192e","Type":"ContainerStarted","Data":"5fea2a92ca28b8b3ccf16b80a8f40165ae5e03c036a8b68f51a7b4aeadc34189"} Dec 16 15:51:50 crc kubenswrapper[4775]: I1216 15:51:50.998783 4775 generic.go:334] "Generic (PLEG): container finished" podID="d71a09ff-b336-4c9f-9950-4895dfd2192e" containerID="5fea2a92ca28b8b3ccf16b80a8f40165ae5e03c036a8b68f51a7b4aeadc34189" exitCode=0 Dec 16 15:51:50 crc kubenswrapper[4775]: I1216 15:51:50.998908 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj99q" 
event={"ID":"d71a09ff-b336-4c9f-9950-4895dfd2192e","Type":"ContainerDied","Data":"5fea2a92ca28b8b3ccf16b80a8f40165ae5e03c036a8b68f51a7b4aeadc34189"} Dec 16 15:51:53 crc kubenswrapper[4775]: I1216 15:51:53.022502 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj99q" event={"ID":"d71a09ff-b336-4c9f-9950-4895dfd2192e","Type":"ContainerStarted","Data":"522802e97d96fd3f465ca7753a2e4330ffe17da2b54adc8cc5f8acbb6c4dfdd8"} Dec 16 15:51:53 crc kubenswrapper[4775]: I1216 15:51:53.055997 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zj99q" podStartSLOduration=2.918876005 podStartE2EDuration="6.0559729s" podCreationTimestamp="2025-12-16 15:51:47 +0000 UTC" firstStartedPulling="2025-12-16 15:51:48.980115575 +0000 UTC m=+3433.931194498" lastFinishedPulling="2025-12-16 15:51:52.11721247 +0000 UTC m=+3437.068291393" observedRunningTime="2025-12-16 15:51:53.04821117 +0000 UTC m=+3437.999290103" watchObservedRunningTime="2025-12-16 15:51:53.0559729 +0000 UTC m=+3438.007051833" Dec 16 15:51:58 crc kubenswrapper[4775]: I1216 15:51:58.193395 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zj99q" Dec 16 15:51:58 crc kubenswrapper[4775]: I1216 15:51:58.194309 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zj99q" Dec 16 15:51:58 crc kubenswrapper[4775]: I1216 15:51:58.237299 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zj99q" Dec 16 15:51:59 crc kubenswrapper[4775]: I1216 15:51:59.137622 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zj99q" Dec 16 15:51:59 crc kubenswrapper[4775]: I1216 15:51:59.183270 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-zj99q"] Dec 16 15:52:01 crc kubenswrapper[4775]: I1216 15:52:01.109338 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zj99q" podUID="d71a09ff-b336-4c9f-9950-4895dfd2192e" containerName="registry-server" containerID="cri-o://522802e97d96fd3f465ca7753a2e4330ffe17da2b54adc8cc5f8acbb6c4dfdd8" gracePeriod=2 Dec 16 15:52:01 crc kubenswrapper[4775]: I1216 15:52:01.627630 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zj99q" Dec 16 15:52:01 crc kubenswrapper[4775]: I1216 15:52:01.671506 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txz8l\" (UniqueName: \"kubernetes.io/projected/d71a09ff-b336-4c9f-9950-4895dfd2192e-kube-api-access-txz8l\") pod \"d71a09ff-b336-4c9f-9950-4895dfd2192e\" (UID: \"d71a09ff-b336-4c9f-9950-4895dfd2192e\") " Dec 16 15:52:01 crc kubenswrapper[4775]: I1216 15:52:01.671663 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d71a09ff-b336-4c9f-9950-4895dfd2192e-catalog-content\") pod \"d71a09ff-b336-4c9f-9950-4895dfd2192e\" (UID: \"d71a09ff-b336-4c9f-9950-4895dfd2192e\") " Dec 16 15:52:01 crc kubenswrapper[4775]: I1216 15:52:01.671692 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d71a09ff-b336-4c9f-9950-4895dfd2192e-utilities\") pod \"d71a09ff-b336-4c9f-9950-4895dfd2192e\" (UID: \"d71a09ff-b336-4c9f-9950-4895dfd2192e\") " Dec 16 15:52:01 crc kubenswrapper[4775]: I1216 15:52:01.672816 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d71a09ff-b336-4c9f-9950-4895dfd2192e-utilities" (OuterVolumeSpecName: "utilities") pod "d71a09ff-b336-4c9f-9950-4895dfd2192e" (UID: 
"d71a09ff-b336-4c9f-9950-4895dfd2192e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:52:01 crc kubenswrapper[4775]: I1216 15:52:01.673064 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d71a09ff-b336-4c9f-9950-4895dfd2192e-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:52:01 crc kubenswrapper[4775]: I1216 15:52:01.677556 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d71a09ff-b336-4c9f-9950-4895dfd2192e-kube-api-access-txz8l" (OuterVolumeSpecName: "kube-api-access-txz8l") pod "d71a09ff-b336-4c9f-9950-4895dfd2192e" (UID: "d71a09ff-b336-4c9f-9950-4895dfd2192e"). InnerVolumeSpecName "kube-api-access-txz8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:52:01 crc kubenswrapper[4775]: I1216 15:52:01.722651 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d71a09ff-b336-4c9f-9950-4895dfd2192e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d71a09ff-b336-4c9f-9950-4895dfd2192e" (UID: "d71a09ff-b336-4c9f-9950-4895dfd2192e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:52:01 crc kubenswrapper[4775]: I1216 15:52:01.774624 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d71a09ff-b336-4c9f-9950-4895dfd2192e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:52:01 crc kubenswrapper[4775]: I1216 15:52:01.774656 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txz8l\" (UniqueName: \"kubernetes.io/projected/d71a09ff-b336-4c9f-9950-4895dfd2192e-kube-api-access-txz8l\") on node \"crc\" DevicePath \"\"" Dec 16 15:52:02 crc kubenswrapper[4775]: I1216 15:52:02.120593 4775 generic.go:334] "Generic (PLEG): container finished" podID="d71a09ff-b336-4c9f-9950-4895dfd2192e" containerID="522802e97d96fd3f465ca7753a2e4330ffe17da2b54adc8cc5f8acbb6c4dfdd8" exitCode=0 Dec 16 15:52:02 crc kubenswrapper[4775]: I1216 15:52:02.120671 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zj99q" Dec 16 15:52:02 crc kubenswrapper[4775]: I1216 15:52:02.120692 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj99q" event={"ID":"d71a09ff-b336-4c9f-9950-4895dfd2192e","Type":"ContainerDied","Data":"522802e97d96fd3f465ca7753a2e4330ffe17da2b54adc8cc5f8acbb6c4dfdd8"} Dec 16 15:52:02 crc kubenswrapper[4775]: I1216 15:52:02.121120 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zj99q" event={"ID":"d71a09ff-b336-4c9f-9950-4895dfd2192e","Type":"ContainerDied","Data":"6acab732bf2092b1c68b854ac75ca900db9a60d19d87c868b2548882885e0862"} Dec 16 15:52:02 crc kubenswrapper[4775]: I1216 15:52:02.121140 4775 scope.go:117] "RemoveContainer" containerID="522802e97d96fd3f465ca7753a2e4330ffe17da2b54adc8cc5f8acbb6c4dfdd8" Dec 16 15:52:02 crc kubenswrapper[4775]: I1216 15:52:02.159989 4775 scope.go:117] "RemoveContainer" 
containerID="5fea2a92ca28b8b3ccf16b80a8f40165ae5e03c036a8b68f51a7b4aeadc34189" Dec 16 15:52:02 crc kubenswrapper[4775]: I1216 15:52:02.171627 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zj99q"] Dec 16 15:52:02 crc kubenswrapper[4775]: I1216 15:52:02.187074 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zj99q"] Dec 16 15:52:02 crc kubenswrapper[4775]: I1216 15:52:02.197587 4775 scope.go:117] "RemoveContainer" containerID="996b87273282136e7ac3ec6c670982dba9ddc2907363628c21085bbf63b96eb0" Dec 16 15:52:02 crc kubenswrapper[4775]: I1216 15:52:02.230925 4775 scope.go:117] "RemoveContainer" containerID="522802e97d96fd3f465ca7753a2e4330ffe17da2b54adc8cc5f8acbb6c4dfdd8" Dec 16 15:52:02 crc kubenswrapper[4775]: E1216 15:52:02.231605 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"522802e97d96fd3f465ca7753a2e4330ffe17da2b54adc8cc5f8acbb6c4dfdd8\": container with ID starting with 522802e97d96fd3f465ca7753a2e4330ffe17da2b54adc8cc5f8acbb6c4dfdd8 not found: ID does not exist" containerID="522802e97d96fd3f465ca7753a2e4330ffe17da2b54adc8cc5f8acbb6c4dfdd8" Dec 16 15:52:02 crc kubenswrapper[4775]: I1216 15:52:02.231665 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"522802e97d96fd3f465ca7753a2e4330ffe17da2b54adc8cc5f8acbb6c4dfdd8"} err="failed to get container status \"522802e97d96fd3f465ca7753a2e4330ffe17da2b54adc8cc5f8acbb6c4dfdd8\": rpc error: code = NotFound desc = could not find container \"522802e97d96fd3f465ca7753a2e4330ffe17da2b54adc8cc5f8acbb6c4dfdd8\": container with ID starting with 522802e97d96fd3f465ca7753a2e4330ffe17da2b54adc8cc5f8acbb6c4dfdd8 not found: ID does not exist" Dec 16 15:52:02 crc kubenswrapper[4775]: I1216 15:52:02.231699 4775 scope.go:117] "RemoveContainer" 
containerID="5fea2a92ca28b8b3ccf16b80a8f40165ae5e03c036a8b68f51a7b4aeadc34189" Dec 16 15:52:02 crc kubenswrapper[4775]: E1216 15:52:02.232194 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fea2a92ca28b8b3ccf16b80a8f40165ae5e03c036a8b68f51a7b4aeadc34189\": container with ID starting with 5fea2a92ca28b8b3ccf16b80a8f40165ae5e03c036a8b68f51a7b4aeadc34189 not found: ID does not exist" containerID="5fea2a92ca28b8b3ccf16b80a8f40165ae5e03c036a8b68f51a7b4aeadc34189" Dec 16 15:52:02 crc kubenswrapper[4775]: I1216 15:52:02.232236 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fea2a92ca28b8b3ccf16b80a8f40165ae5e03c036a8b68f51a7b4aeadc34189"} err="failed to get container status \"5fea2a92ca28b8b3ccf16b80a8f40165ae5e03c036a8b68f51a7b4aeadc34189\": rpc error: code = NotFound desc = could not find container \"5fea2a92ca28b8b3ccf16b80a8f40165ae5e03c036a8b68f51a7b4aeadc34189\": container with ID starting with 5fea2a92ca28b8b3ccf16b80a8f40165ae5e03c036a8b68f51a7b4aeadc34189 not found: ID does not exist" Dec 16 15:52:02 crc kubenswrapper[4775]: I1216 15:52:02.232263 4775 scope.go:117] "RemoveContainer" containerID="996b87273282136e7ac3ec6c670982dba9ddc2907363628c21085bbf63b96eb0" Dec 16 15:52:02 crc kubenswrapper[4775]: E1216 15:52:02.232631 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"996b87273282136e7ac3ec6c670982dba9ddc2907363628c21085bbf63b96eb0\": container with ID starting with 996b87273282136e7ac3ec6c670982dba9ddc2907363628c21085bbf63b96eb0 not found: ID does not exist" containerID="996b87273282136e7ac3ec6c670982dba9ddc2907363628c21085bbf63b96eb0" Dec 16 15:52:02 crc kubenswrapper[4775]: I1216 15:52:02.232657 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"996b87273282136e7ac3ec6c670982dba9ddc2907363628c21085bbf63b96eb0"} err="failed to get container status \"996b87273282136e7ac3ec6c670982dba9ddc2907363628c21085bbf63b96eb0\": rpc error: code = NotFound desc = could not find container \"996b87273282136e7ac3ec6c670982dba9ddc2907363628c21085bbf63b96eb0\": container with ID starting with 996b87273282136e7ac3ec6c670982dba9ddc2907363628c21085bbf63b96eb0 not found: ID does not exist" Dec 16 15:52:03 crc kubenswrapper[4775]: I1216 15:52:03.349653 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d71a09ff-b336-4c9f-9950-4895dfd2192e" path="/var/lib/kubelet/pods/d71a09ff-b336-4c9f-9950-4895dfd2192e/volumes" Dec 16 15:52:03 crc kubenswrapper[4775]: I1216 15:52:03.878182 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w8kxz"] Dec 16 15:52:03 crc kubenswrapper[4775]: E1216 15:52:03.878633 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71a09ff-b336-4c9f-9950-4895dfd2192e" containerName="extract-utilities" Dec 16 15:52:03 crc kubenswrapper[4775]: I1216 15:52:03.878650 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71a09ff-b336-4c9f-9950-4895dfd2192e" containerName="extract-utilities" Dec 16 15:52:03 crc kubenswrapper[4775]: E1216 15:52:03.878663 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71a09ff-b336-4c9f-9950-4895dfd2192e" containerName="extract-content" Dec 16 15:52:03 crc kubenswrapper[4775]: I1216 15:52:03.878669 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71a09ff-b336-4c9f-9950-4895dfd2192e" containerName="extract-content" Dec 16 15:52:03 crc kubenswrapper[4775]: E1216 15:52:03.878695 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71a09ff-b336-4c9f-9950-4895dfd2192e" containerName="registry-server" Dec 16 15:52:03 crc kubenswrapper[4775]: I1216 15:52:03.878702 4775 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d71a09ff-b336-4c9f-9950-4895dfd2192e" containerName="registry-server" Dec 16 15:52:03 crc kubenswrapper[4775]: I1216 15:52:03.878937 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d71a09ff-b336-4c9f-9950-4895dfd2192e" containerName="registry-server" Dec 16 15:52:03 crc kubenswrapper[4775]: I1216 15:52:03.880222 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8kxz" Dec 16 15:52:03 crc kubenswrapper[4775]: I1216 15:52:03.897407 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w8kxz"] Dec 16 15:52:04 crc kubenswrapper[4775]: I1216 15:52:04.027094 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-utilities\") pod \"community-operators-w8kxz\" (UID: \"b7dc4ef8-185d-4bb9-aacc-c68121ec0968\") " pod="openshift-marketplace/community-operators-w8kxz" Dec 16 15:52:04 crc kubenswrapper[4775]: I1216 15:52:04.027429 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ghdm\" (UniqueName: \"kubernetes.io/projected/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-kube-api-access-4ghdm\") pod \"community-operators-w8kxz\" (UID: \"b7dc4ef8-185d-4bb9-aacc-c68121ec0968\") " pod="openshift-marketplace/community-operators-w8kxz" Dec 16 15:52:04 crc kubenswrapper[4775]: I1216 15:52:04.027581 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-catalog-content\") pod \"community-operators-w8kxz\" (UID: \"b7dc4ef8-185d-4bb9-aacc-c68121ec0968\") " pod="openshift-marketplace/community-operators-w8kxz" Dec 16 15:52:04 crc kubenswrapper[4775]: I1216 15:52:04.129973 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-utilities\") pod \"community-operators-w8kxz\" (UID: \"b7dc4ef8-185d-4bb9-aacc-c68121ec0968\") " pod="openshift-marketplace/community-operators-w8kxz" Dec 16 15:52:04 crc kubenswrapper[4775]: I1216 15:52:04.130422 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ghdm\" (UniqueName: \"kubernetes.io/projected/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-kube-api-access-4ghdm\") pod \"community-operators-w8kxz\" (UID: \"b7dc4ef8-185d-4bb9-aacc-c68121ec0968\") " pod="openshift-marketplace/community-operators-w8kxz" Dec 16 15:52:04 crc kubenswrapper[4775]: I1216 15:52:04.130937 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-utilities\") pod \"community-operators-w8kxz\" (UID: \"b7dc4ef8-185d-4bb9-aacc-c68121ec0968\") " pod="openshift-marketplace/community-operators-w8kxz" Dec 16 15:52:04 crc kubenswrapper[4775]: I1216 15:52:04.131081 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-catalog-content\") pod \"community-operators-w8kxz\" (UID: \"b7dc4ef8-185d-4bb9-aacc-c68121ec0968\") " pod="openshift-marketplace/community-operators-w8kxz" Dec 16 15:52:04 crc kubenswrapper[4775]: I1216 15:52:04.131523 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-catalog-content\") pod \"community-operators-w8kxz\" (UID: \"b7dc4ef8-185d-4bb9-aacc-c68121ec0968\") " pod="openshift-marketplace/community-operators-w8kxz" Dec 16 15:52:04 crc kubenswrapper[4775]: I1216 15:52:04.153718 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4ghdm\" (UniqueName: \"kubernetes.io/projected/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-kube-api-access-4ghdm\") pod \"community-operators-w8kxz\" (UID: \"b7dc4ef8-185d-4bb9-aacc-c68121ec0968\") " pod="openshift-marketplace/community-operators-w8kxz" Dec 16 15:52:04 crc kubenswrapper[4775]: I1216 15:52:04.204038 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8kxz" Dec 16 15:52:04 crc kubenswrapper[4775]: I1216 15:52:04.781596 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w8kxz"] Dec 16 15:52:04 crc kubenswrapper[4775]: W1216 15:52:04.787743 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7dc4ef8_185d_4bb9_aacc_c68121ec0968.slice/crio-c5cf3e2120353a4989bf561468097c7038fac08bc0489e2cfea5db7f40500a5f WatchSource:0}: Error finding container c5cf3e2120353a4989bf561468097c7038fac08bc0489e2cfea5db7f40500a5f: Status 404 returned error can't find the container with id c5cf3e2120353a4989bf561468097c7038fac08bc0489e2cfea5db7f40500a5f Dec 16 15:52:05 crc kubenswrapper[4775]: I1216 15:52:05.147092 4775 generic.go:334] "Generic (PLEG): container finished" podID="b7dc4ef8-185d-4bb9-aacc-c68121ec0968" containerID="71525ac1a561f727adcaf8d4d62c82b4e072d4ecdcee882314801bec7aee9236" exitCode=0 Dec 16 15:52:05 crc kubenswrapper[4775]: I1216 15:52:05.147151 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8kxz" event={"ID":"b7dc4ef8-185d-4bb9-aacc-c68121ec0968","Type":"ContainerDied","Data":"71525ac1a561f727adcaf8d4d62c82b4e072d4ecdcee882314801bec7aee9236"} Dec 16 15:52:05 crc kubenswrapper[4775]: I1216 15:52:05.147377 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8kxz" 
event={"ID":"b7dc4ef8-185d-4bb9-aacc-c68121ec0968","Type":"ContainerStarted","Data":"c5cf3e2120353a4989bf561468097c7038fac08bc0489e2cfea5db7f40500a5f"} Dec 16 15:52:06 crc kubenswrapper[4775]: I1216 15:52:06.163388 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8kxz" event={"ID":"b7dc4ef8-185d-4bb9-aacc-c68121ec0968","Type":"ContainerStarted","Data":"80d6b250c51084646c34f6238664bc26e97d99a7323b1f11fbd8a00e557ee572"} Dec 16 15:52:07 crc kubenswrapper[4775]: I1216 15:52:07.175328 4775 generic.go:334] "Generic (PLEG): container finished" podID="b7dc4ef8-185d-4bb9-aacc-c68121ec0968" containerID="80d6b250c51084646c34f6238664bc26e97d99a7323b1f11fbd8a00e557ee572" exitCode=0 Dec 16 15:52:07 crc kubenswrapper[4775]: I1216 15:52:07.175439 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8kxz" event={"ID":"b7dc4ef8-185d-4bb9-aacc-c68121ec0968","Type":"ContainerDied","Data":"80d6b250c51084646c34f6238664bc26e97d99a7323b1f11fbd8a00e557ee572"} Dec 16 15:52:09 crc kubenswrapper[4775]: I1216 15:52:09.202592 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8kxz" event={"ID":"b7dc4ef8-185d-4bb9-aacc-c68121ec0968","Type":"ContainerStarted","Data":"3b1bfe113be98fa9fab7a69f5fae2fdc82ab4ba8dc624b5d573589856fb1ce5e"} Dec 16 15:52:09 crc kubenswrapper[4775]: I1216 15:52:09.231437 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w8kxz" podStartSLOduration=3.201453262 podStartE2EDuration="6.231410576s" podCreationTimestamp="2025-12-16 15:52:03 +0000 UTC" firstStartedPulling="2025-12-16 15:52:05.14907598 +0000 UTC m=+3450.100154903" lastFinishedPulling="2025-12-16 15:52:08.179033274 +0000 UTC m=+3453.130112217" observedRunningTime="2025-12-16 15:52:09.225976208 +0000 UTC m=+3454.177055151" watchObservedRunningTime="2025-12-16 15:52:09.231410576 +0000 UTC 
m=+3454.182489509" Dec 16 15:52:14 crc kubenswrapper[4775]: I1216 15:52:14.204939 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w8kxz" Dec 16 15:52:14 crc kubenswrapper[4775]: I1216 15:52:14.205534 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w8kxz" Dec 16 15:52:14 crc kubenswrapper[4775]: I1216 15:52:14.273087 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w8kxz" Dec 16 15:52:14 crc kubenswrapper[4775]: I1216 15:52:14.336787 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w8kxz" Dec 16 15:52:14 crc kubenswrapper[4775]: I1216 15:52:14.509528 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w8kxz"] Dec 16 15:52:16 crc kubenswrapper[4775]: I1216 15:52:16.273628 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w8kxz" podUID="b7dc4ef8-185d-4bb9-aacc-c68121ec0968" containerName="registry-server" containerID="cri-o://3b1bfe113be98fa9fab7a69f5fae2fdc82ab4ba8dc624b5d573589856fb1ce5e" gracePeriod=2 Dec 16 15:52:16 crc kubenswrapper[4775]: I1216 15:52:16.774974 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w8kxz" Dec 16 15:52:16 crc kubenswrapper[4775]: I1216 15:52:16.904866 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ghdm\" (UniqueName: \"kubernetes.io/projected/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-kube-api-access-4ghdm\") pod \"b7dc4ef8-185d-4bb9-aacc-c68121ec0968\" (UID: \"b7dc4ef8-185d-4bb9-aacc-c68121ec0968\") " Dec 16 15:52:16 crc kubenswrapper[4775]: I1216 15:52:16.904956 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-catalog-content\") pod \"b7dc4ef8-185d-4bb9-aacc-c68121ec0968\" (UID: \"b7dc4ef8-185d-4bb9-aacc-c68121ec0968\") " Dec 16 15:52:16 crc kubenswrapper[4775]: I1216 15:52:16.905459 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-utilities\") pod \"b7dc4ef8-185d-4bb9-aacc-c68121ec0968\" (UID: \"b7dc4ef8-185d-4bb9-aacc-c68121ec0968\") " Dec 16 15:52:16 crc kubenswrapper[4775]: I1216 15:52:16.907113 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-utilities" (OuterVolumeSpecName: "utilities") pod "b7dc4ef8-185d-4bb9-aacc-c68121ec0968" (UID: "b7dc4ef8-185d-4bb9-aacc-c68121ec0968"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:52:16 crc kubenswrapper[4775]: I1216 15:52:16.914863 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-kube-api-access-4ghdm" (OuterVolumeSpecName: "kube-api-access-4ghdm") pod "b7dc4ef8-185d-4bb9-aacc-c68121ec0968" (UID: "b7dc4ef8-185d-4bb9-aacc-c68121ec0968"). InnerVolumeSpecName "kube-api-access-4ghdm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:52:16 crc kubenswrapper[4775]: I1216 15:52:16.986173 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7dc4ef8-185d-4bb9-aacc-c68121ec0968" (UID: "b7dc4ef8-185d-4bb9-aacc-c68121ec0968"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.009406 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.009442 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ghdm\" (UniqueName: \"kubernetes.io/projected/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-kube-api-access-4ghdm\") on node \"crc\" DevicePath \"\"" Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.009461 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7dc4ef8-185d-4bb9-aacc-c68121ec0968-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.284735 4775 generic.go:334] "Generic (PLEG): container finished" podID="b7dc4ef8-185d-4bb9-aacc-c68121ec0968" containerID="3b1bfe113be98fa9fab7a69f5fae2fdc82ab4ba8dc624b5d573589856fb1ce5e" exitCode=0 Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.284791 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w8kxz" Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.284810 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8kxz" event={"ID":"b7dc4ef8-185d-4bb9-aacc-c68121ec0968","Type":"ContainerDied","Data":"3b1bfe113be98fa9fab7a69f5fae2fdc82ab4ba8dc624b5d573589856fb1ce5e"} Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.285251 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8kxz" event={"ID":"b7dc4ef8-185d-4bb9-aacc-c68121ec0968","Type":"ContainerDied","Data":"c5cf3e2120353a4989bf561468097c7038fac08bc0489e2cfea5db7f40500a5f"} Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.285275 4775 scope.go:117] "RemoveContainer" containerID="3b1bfe113be98fa9fab7a69f5fae2fdc82ab4ba8dc624b5d573589856fb1ce5e" Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.312705 4775 scope.go:117] "RemoveContainer" containerID="80d6b250c51084646c34f6238664bc26e97d99a7323b1f11fbd8a00e557ee572" Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.328530 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w8kxz"] Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.351314 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w8kxz"] Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.370074 4775 scope.go:117] "RemoveContainer" containerID="71525ac1a561f727adcaf8d4d62c82b4e072d4ecdcee882314801bec7aee9236" Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.393362 4775 scope.go:117] "RemoveContainer" containerID="3b1bfe113be98fa9fab7a69f5fae2fdc82ab4ba8dc624b5d573589856fb1ce5e" Dec 16 15:52:17 crc kubenswrapper[4775]: E1216 15:52:17.393796 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3b1bfe113be98fa9fab7a69f5fae2fdc82ab4ba8dc624b5d573589856fb1ce5e\": container with ID starting with 3b1bfe113be98fa9fab7a69f5fae2fdc82ab4ba8dc624b5d573589856fb1ce5e not found: ID does not exist" containerID="3b1bfe113be98fa9fab7a69f5fae2fdc82ab4ba8dc624b5d573589856fb1ce5e" Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.393832 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b1bfe113be98fa9fab7a69f5fae2fdc82ab4ba8dc624b5d573589856fb1ce5e"} err="failed to get container status \"3b1bfe113be98fa9fab7a69f5fae2fdc82ab4ba8dc624b5d573589856fb1ce5e\": rpc error: code = NotFound desc = could not find container \"3b1bfe113be98fa9fab7a69f5fae2fdc82ab4ba8dc624b5d573589856fb1ce5e\": container with ID starting with 3b1bfe113be98fa9fab7a69f5fae2fdc82ab4ba8dc624b5d573589856fb1ce5e not found: ID does not exist" Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.393851 4775 scope.go:117] "RemoveContainer" containerID="80d6b250c51084646c34f6238664bc26e97d99a7323b1f11fbd8a00e557ee572" Dec 16 15:52:17 crc kubenswrapper[4775]: E1216 15:52:17.394139 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80d6b250c51084646c34f6238664bc26e97d99a7323b1f11fbd8a00e557ee572\": container with ID starting with 80d6b250c51084646c34f6238664bc26e97d99a7323b1f11fbd8a00e557ee572 not found: ID does not exist" containerID="80d6b250c51084646c34f6238664bc26e97d99a7323b1f11fbd8a00e557ee572" Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.394168 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d6b250c51084646c34f6238664bc26e97d99a7323b1f11fbd8a00e557ee572"} err="failed to get container status \"80d6b250c51084646c34f6238664bc26e97d99a7323b1f11fbd8a00e557ee572\": rpc error: code = NotFound desc = could not find container \"80d6b250c51084646c34f6238664bc26e97d99a7323b1f11fbd8a00e557ee572\": container with ID 
starting with 80d6b250c51084646c34f6238664bc26e97d99a7323b1f11fbd8a00e557ee572 not found: ID does not exist" Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.394181 4775 scope.go:117] "RemoveContainer" containerID="71525ac1a561f727adcaf8d4d62c82b4e072d4ecdcee882314801bec7aee9236" Dec 16 15:52:17 crc kubenswrapper[4775]: E1216 15:52:17.394454 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71525ac1a561f727adcaf8d4d62c82b4e072d4ecdcee882314801bec7aee9236\": container with ID starting with 71525ac1a561f727adcaf8d4d62c82b4e072d4ecdcee882314801bec7aee9236 not found: ID does not exist" containerID="71525ac1a561f727adcaf8d4d62c82b4e072d4ecdcee882314801bec7aee9236" Dec 16 15:52:17 crc kubenswrapper[4775]: I1216 15:52:17.394480 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71525ac1a561f727adcaf8d4d62c82b4e072d4ecdcee882314801bec7aee9236"} err="failed to get container status \"71525ac1a561f727adcaf8d4d62c82b4e072d4ecdcee882314801bec7aee9236\": rpc error: code = NotFound desc = could not find container \"71525ac1a561f727adcaf8d4d62c82b4e072d4ecdcee882314801bec7aee9236\": container with ID starting with 71525ac1a561f727adcaf8d4d62c82b4e072d4ecdcee882314801bec7aee9236 not found: ID does not exist" Dec 16 15:52:19 crc kubenswrapper[4775]: I1216 15:52:19.348300 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7dc4ef8-185d-4bb9-aacc-c68121ec0968" path="/var/lib/kubelet/pods/b7dc4ef8-185d-4bb9-aacc-c68121ec0968/volumes" Dec 16 15:52:21 crc kubenswrapper[4775]: I1216 15:52:21.127487 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vs5cx"] Dec 16 15:52:21 crc kubenswrapper[4775]: E1216 15:52:21.128295 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7dc4ef8-185d-4bb9-aacc-c68121ec0968" containerName="extract-content" Dec 16 15:52:21 crc 
kubenswrapper[4775]: I1216 15:52:21.128311 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7dc4ef8-185d-4bb9-aacc-c68121ec0968" containerName="extract-content" Dec 16 15:52:21 crc kubenswrapper[4775]: E1216 15:52:21.128329 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7dc4ef8-185d-4bb9-aacc-c68121ec0968" containerName="registry-server" Dec 16 15:52:21 crc kubenswrapper[4775]: I1216 15:52:21.128338 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7dc4ef8-185d-4bb9-aacc-c68121ec0968" containerName="registry-server" Dec 16 15:52:21 crc kubenswrapper[4775]: E1216 15:52:21.128378 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7dc4ef8-185d-4bb9-aacc-c68121ec0968" containerName="extract-utilities" Dec 16 15:52:21 crc kubenswrapper[4775]: I1216 15:52:21.128388 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7dc4ef8-185d-4bb9-aacc-c68121ec0968" containerName="extract-utilities" Dec 16 15:52:21 crc kubenswrapper[4775]: I1216 15:52:21.128690 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7dc4ef8-185d-4bb9-aacc-c68121ec0968" containerName="registry-server" Dec 16 15:52:21 crc kubenswrapper[4775]: I1216 15:52:21.130585 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vs5cx" Dec 16 15:52:21 crc kubenswrapper[4775]: I1216 15:52:21.143237 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vs5cx"] Dec 16 15:52:21 crc kubenswrapper[4775]: I1216 15:52:21.186110 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72db5a2d-07ca-4213-83b9-3d55eaa696b7-utilities\") pod \"redhat-marketplace-vs5cx\" (UID: \"72db5a2d-07ca-4213-83b9-3d55eaa696b7\") " pod="openshift-marketplace/redhat-marketplace-vs5cx" Dec 16 15:52:21 crc kubenswrapper[4775]: I1216 15:52:21.186507 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sms2g\" (UniqueName: \"kubernetes.io/projected/72db5a2d-07ca-4213-83b9-3d55eaa696b7-kube-api-access-sms2g\") pod \"redhat-marketplace-vs5cx\" (UID: \"72db5a2d-07ca-4213-83b9-3d55eaa696b7\") " pod="openshift-marketplace/redhat-marketplace-vs5cx" Dec 16 15:52:21 crc kubenswrapper[4775]: I1216 15:52:21.186707 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72db5a2d-07ca-4213-83b9-3d55eaa696b7-catalog-content\") pod \"redhat-marketplace-vs5cx\" (UID: \"72db5a2d-07ca-4213-83b9-3d55eaa696b7\") " pod="openshift-marketplace/redhat-marketplace-vs5cx" Dec 16 15:52:21 crc kubenswrapper[4775]: I1216 15:52:21.288274 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72db5a2d-07ca-4213-83b9-3d55eaa696b7-catalog-content\") pod \"redhat-marketplace-vs5cx\" (UID: \"72db5a2d-07ca-4213-83b9-3d55eaa696b7\") " pod="openshift-marketplace/redhat-marketplace-vs5cx" Dec 16 15:52:21 crc kubenswrapper[4775]: I1216 15:52:21.288364 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72db5a2d-07ca-4213-83b9-3d55eaa696b7-utilities\") pod \"redhat-marketplace-vs5cx\" (UID: \"72db5a2d-07ca-4213-83b9-3d55eaa696b7\") " pod="openshift-marketplace/redhat-marketplace-vs5cx" Dec 16 15:52:21 crc kubenswrapper[4775]: I1216 15:52:21.288394 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sms2g\" (UniqueName: \"kubernetes.io/projected/72db5a2d-07ca-4213-83b9-3d55eaa696b7-kube-api-access-sms2g\") pod \"redhat-marketplace-vs5cx\" (UID: \"72db5a2d-07ca-4213-83b9-3d55eaa696b7\") " pod="openshift-marketplace/redhat-marketplace-vs5cx" Dec 16 15:52:21 crc kubenswrapper[4775]: I1216 15:52:21.288959 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72db5a2d-07ca-4213-83b9-3d55eaa696b7-catalog-content\") pod \"redhat-marketplace-vs5cx\" (UID: \"72db5a2d-07ca-4213-83b9-3d55eaa696b7\") " pod="openshift-marketplace/redhat-marketplace-vs5cx" Dec 16 15:52:21 crc kubenswrapper[4775]: I1216 15:52:21.289165 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72db5a2d-07ca-4213-83b9-3d55eaa696b7-utilities\") pod \"redhat-marketplace-vs5cx\" (UID: \"72db5a2d-07ca-4213-83b9-3d55eaa696b7\") " pod="openshift-marketplace/redhat-marketplace-vs5cx" Dec 16 15:52:21 crc kubenswrapper[4775]: I1216 15:52:21.310534 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sms2g\" (UniqueName: \"kubernetes.io/projected/72db5a2d-07ca-4213-83b9-3d55eaa696b7-kube-api-access-sms2g\") pod \"redhat-marketplace-vs5cx\" (UID: \"72db5a2d-07ca-4213-83b9-3d55eaa696b7\") " pod="openshift-marketplace/redhat-marketplace-vs5cx" Dec 16 15:52:21 crc kubenswrapper[4775]: I1216 15:52:21.462045 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vs5cx" Dec 16 15:52:21 crc kubenswrapper[4775]: I1216 15:52:21.948516 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vs5cx"] Dec 16 15:52:22 crc kubenswrapper[4775]: I1216 15:52:22.331216 4775 generic.go:334] "Generic (PLEG): container finished" podID="72db5a2d-07ca-4213-83b9-3d55eaa696b7" containerID="0deecb112f16a66e3dde63983e4232d7df489d95952303b9bd31a45f3e423951" exitCode=0 Dec 16 15:52:22 crc kubenswrapper[4775]: I1216 15:52:22.331301 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vs5cx" event={"ID":"72db5a2d-07ca-4213-83b9-3d55eaa696b7","Type":"ContainerDied","Data":"0deecb112f16a66e3dde63983e4232d7df489d95952303b9bd31a45f3e423951"} Dec 16 15:52:22 crc kubenswrapper[4775]: I1216 15:52:22.331718 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vs5cx" event={"ID":"72db5a2d-07ca-4213-83b9-3d55eaa696b7","Type":"ContainerStarted","Data":"79b594b3e740933e184f29c798dfafaf973482a770bf5b20a0909e1b53ed1c05"} Dec 16 15:52:24 crc kubenswrapper[4775]: I1216 15:52:24.352047 4775 generic.go:334] "Generic (PLEG): container finished" podID="72db5a2d-07ca-4213-83b9-3d55eaa696b7" containerID="592f946d6a589d5525f99f89c77428440b89d3baf6ad5dbe9a28479c3911f1ce" exitCode=0 Dec 16 15:52:24 crc kubenswrapper[4775]: I1216 15:52:24.352140 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vs5cx" event={"ID":"72db5a2d-07ca-4213-83b9-3d55eaa696b7","Type":"ContainerDied","Data":"592f946d6a589d5525f99f89c77428440b89d3baf6ad5dbe9a28479c3911f1ce"} Dec 16 15:52:25 crc kubenswrapper[4775]: I1216 15:52:25.363755 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vs5cx" 
event={"ID":"72db5a2d-07ca-4213-83b9-3d55eaa696b7","Type":"ContainerStarted","Data":"03e6dc2f7859ccdbbc0be1416d5814115aedb0c3cf8298ae3efc51ea3e133c6f"} Dec 16 15:52:25 crc kubenswrapper[4775]: I1216 15:52:25.399155 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vs5cx" podStartSLOduration=1.8656268219999999 podStartE2EDuration="4.399126002s" podCreationTimestamp="2025-12-16 15:52:21 +0000 UTC" firstStartedPulling="2025-12-16 15:52:22.335236697 +0000 UTC m=+3467.286315620" lastFinishedPulling="2025-12-16 15:52:24.868735867 +0000 UTC m=+3469.819814800" observedRunningTime="2025-12-16 15:52:25.388790073 +0000 UTC m=+3470.339868996" watchObservedRunningTime="2025-12-16 15:52:25.399126002 +0000 UTC m=+3470.350204945" Dec 16 15:52:31 crc kubenswrapper[4775]: I1216 15:52:31.463667 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vs5cx" Dec 16 15:52:31 crc kubenswrapper[4775]: I1216 15:52:31.464480 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vs5cx" Dec 16 15:52:31 crc kubenswrapper[4775]: I1216 15:52:31.523765 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vs5cx" Dec 16 15:52:32 crc kubenswrapper[4775]: I1216 15:52:32.514647 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vs5cx" Dec 16 15:52:32 crc kubenswrapper[4775]: I1216 15:52:32.579041 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vs5cx"] Dec 16 15:52:34 crc kubenswrapper[4775]: I1216 15:52:34.461493 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vs5cx" podUID="72db5a2d-07ca-4213-83b9-3d55eaa696b7" containerName="registry-server" 
containerID="cri-o://03e6dc2f7859ccdbbc0be1416d5814115aedb0c3cf8298ae3efc51ea3e133c6f" gracePeriod=2 Dec 16 15:52:34 crc kubenswrapper[4775]: I1216 15:52:34.976855 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vs5cx" Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.079479 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72db5a2d-07ca-4213-83b9-3d55eaa696b7-catalog-content\") pod \"72db5a2d-07ca-4213-83b9-3d55eaa696b7\" (UID: \"72db5a2d-07ca-4213-83b9-3d55eaa696b7\") " Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.079537 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sms2g\" (UniqueName: \"kubernetes.io/projected/72db5a2d-07ca-4213-83b9-3d55eaa696b7-kube-api-access-sms2g\") pod \"72db5a2d-07ca-4213-83b9-3d55eaa696b7\" (UID: \"72db5a2d-07ca-4213-83b9-3d55eaa696b7\") " Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.079562 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72db5a2d-07ca-4213-83b9-3d55eaa696b7-utilities\") pod \"72db5a2d-07ca-4213-83b9-3d55eaa696b7\" (UID: \"72db5a2d-07ca-4213-83b9-3d55eaa696b7\") " Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.080399 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72db5a2d-07ca-4213-83b9-3d55eaa696b7-utilities" (OuterVolumeSpecName: "utilities") pod "72db5a2d-07ca-4213-83b9-3d55eaa696b7" (UID: "72db5a2d-07ca-4213-83b9-3d55eaa696b7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.080797 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72db5a2d-07ca-4213-83b9-3d55eaa696b7-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.088188 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72db5a2d-07ca-4213-83b9-3d55eaa696b7-kube-api-access-sms2g" (OuterVolumeSpecName: "kube-api-access-sms2g") pod "72db5a2d-07ca-4213-83b9-3d55eaa696b7" (UID: "72db5a2d-07ca-4213-83b9-3d55eaa696b7"). InnerVolumeSpecName "kube-api-access-sms2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.102152 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72db5a2d-07ca-4213-83b9-3d55eaa696b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72db5a2d-07ca-4213-83b9-3d55eaa696b7" (UID: "72db5a2d-07ca-4213-83b9-3d55eaa696b7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.182472 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72db5a2d-07ca-4213-83b9-3d55eaa696b7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.182512 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sms2g\" (UniqueName: \"kubernetes.io/projected/72db5a2d-07ca-4213-83b9-3d55eaa696b7-kube-api-access-sms2g\") on node \"crc\" DevicePath \"\"" Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.470359 4775 generic.go:334] "Generic (PLEG): container finished" podID="72db5a2d-07ca-4213-83b9-3d55eaa696b7" containerID="03e6dc2f7859ccdbbc0be1416d5814115aedb0c3cf8298ae3efc51ea3e133c6f" exitCode=0 Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.470414 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vs5cx" Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.470411 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vs5cx" event={"ID":"72db5a2d-07ca-4213-83b9-3d55eaa696b7","Type":"ContainerDied","Data":"03e6dc2f7859ccdbbc0be1416d5814115aedb0c3cf8298ae3efc51ea3e133c6f"} Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.470465 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vs5cx" event={"ID":"72db5a2d-07ca-4213-83b9-3d55eaa696b7","Type":"ContainerDied","Data":"79b594b3e740933e184f29c798dfafaf973482a770bf5b20a0909e1b53ed1c05"} Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.470487 4775 scope.go:117] "RemoveContainer" containerID="03e6dc2f7859ccdbbc0be1416d5814115aedb0c3cf8298ae3efc51ea3e133c6f" Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.501967 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-vs5cx"] Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.504557 4775 scope.go:117] "RemoveContainer" containerID="592f946d6a589d5525f99f89c77428440b89d3baf6ad5dbe9a28479c3911f1ce" Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.510661 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vs5cx"] Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.522201 4775 scope.go:117] "RemoveContainer" containerID="0deecb112f16a66e3dde63983e4232d7df489d95952303b9bd31a45f3e423951" Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.584373 4775 scope.go:117] "RemoveContainer" containerID="03e6dc2f7859ccdbbc0be1416d5814115aedb0c3cf8298ae3efc51ea3e133c6f" Dec 16 15:52:35 crc kubenswrapper[4775]: E1216 15:52:35.584952 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03e6dc2f7859ccdbbc0be1416d5814115aedb0c3cf8298ae3efc51ea3e133c6f\": container with ID starting with 03e6dc2f7859ccdbbc0be1416d5814115aedb0c3cf8298ae3efc51ea3e133c6f not found: ID does not exist" containerID="03e6dc2f7859ccdbbc0be1416d5814115aedb0c3cf8298ae3efc51ea3e133c6f" Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.585076 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03e6dc2f7859ccdbbc0be1416d5814115aedb0c3cf8298ae3efc51ea3e133c6f"} err="failed to get container status \"03e6dc2f7859ccdbbc0be1416d5814115aedb0c3cf8298ae3efc51ea3e133c6f\": rpc error: code = NotFound desc = could not find container \"03e6dc2f7859ccdbbc0be1416d5814115aedb0c3cf8298ae3efc51ea3e133c6f\": container with ID starting with 03e6dc2f7859ccdbbc0be1416d5814115aedb0c3cf8298ae3efc51ea3e133c6f not found: ID does not exist" Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.585181 4775 scope.go:117] "RemoveContainer" 
containerID="592f946d6a589d5525f99f89c77428440b89d3baf6ad5dbe9a28479c3911f1ce" Dec 16 15:52:35 crc kubenswrapper[4775]: E1216 15:52:35.585590 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"592f946d6a589d5525f99f89c77428440b89d3baf6ad5dbe9a28479c3911f1ce\": container with ID starting with 592f946d6a589d5525f99f89c77428440b89d3baf6ad5dbe9a28479c3911f1ce not found: ID does not exist" containerID="592f946d6a589d5525f99f89c77428440b89d3baf6ad5dbe9a28479c3911f1ce" Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.585623 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"592f946d6a589d5525f99f89c77428440b89d3baf6ad5dbe9a28479c3911f1ce"} err="failed to get container status \"592f946d6a589d5525f99f89c77428440b89d3baf6ad5dbe9a28479c3911f1ce\": rpc error: code = NotFound desc = could not find container \"592f946d6a589d5525f99f89c77428440b89d3baf6ad5dbe9a28479c3911f1ce\": container with ID starting with 592f946d6a589d5525f99f89c77428440b89d3baf6ad5dbe9a28479c3911f1ce not found: ID does not exist" Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.585643 4775 scope.go:117] "RemoveContainer" containerID="0deecb112f16a66e3dde63983e4232d7df489d95952303b9bd31a45f3e423951" Dec 16 15:52:35 crc kubenswrapper[4775]: E1216 15:52:35.585912 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0deecb112f16a66e3dde63983e4232d7df489d95952303b9bd31a45f3e423951\": container with ID starting with 0deecb112f16a66e3dde63983e4232d7df489d95952303b9bd31a45f3e423951 not found: ID does not exist" containerID="0deecb112f16a66e3dde63983e4232d7df489d95952303b9bd31a45f3e423951" Dec 16 15:52:35 crc kubenswrapper[4775]: I1216 15:52:35.585930 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0deecb112f16a66e3dde63983e4232d7df489d95952303b9bd31a45f3e423951"} err="failed to get container status \"0deecb112f16a66e3dde63983e4232d7df489d95952303b9bd31a45f3e423951\": rpc error: code = NotFound desc = could not find container \"0deecb112f16a66e3dde63983e4232d7df489d95952303b9bd31a45f3e423951\": container with ID starting with 0deecb112f16a66e3dde63983e4232d7df489d95952303b9bd31a45f3e423951 not found: ID does not exist" Dec 16 15:52:37 crc kubenswrapper[4775]: I1216 15:52:37.360277 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72db5a2d-07ca-4213-83b9-3d55eaa696b7" path="/var/lib/kubelet/pods/72db5a2d-07ca-4213-83b9-3d55eaa696b7/volumes" Dec 16 15:53:46 crc kubenswrapper[4775]: I1216 15:53:46.187567 4775 generic.go:334] "Generic (PLEG): container finished" podID="81e92dde-6675-4a19-a619-52358e91c49c" containerID="4ba8033fe6f5ca4d275e4f546e31ae7d6fd3f7e1ac0b78d8563ff579820a2be9" exitCode=0 Dec 16 15:53:46 crc kubenswrapper[4775]: I1216 15:53:46.187730 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"81e92dde-6675-4a19-a619-52358e91c49c","Type":"ContainerDied","Data":"4ba8033fe6f5ca4d275e4f546e31ae7d6fd3f7e1ac0b78d8563ff579820a2be9"} Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.815200 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.864856 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-openstack-config-secret\") pod \"81e92dde-6675-4a19-a619-52358e91c49c\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.864978 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-ssh-key\") pod \"81e92dde-6675-4a19-a619-52358e91c49c\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.865008 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwgrh\" (UniqueName: \"kubernetes.io/projected/81e92dde-6675-4a19-a619-52358e91c49c-kube-api-access-bwgrh\") pod \"81e92dde-6675-4a19-a619-52358e91c49c\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.865048 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/81e92dde-6675-4a19-a619-52358e91c49c-test-operator-ephemeral-workdir\") pod \"81e92dde-6675-4a19-a619-52358e91c49c\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.865125 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/81e92dde-6675-4a19-a619-52358e91c49c-openstack-config\") pod \"81e92dde-6675-4a19-a619-52358e91c49c\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.865168 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-ca-certs\") pod \"81e92dde-6675-4a19-a619-52358e91c49c\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.865204 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"81e92dde-6675-4a19-a619-52358e91c49c\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.865270 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/81e92dde-6675-4a19-a619-52358e91c49c-test-operator-ephemeral-temporary\") pod \"81e92dde-6675-4a19-a619-52358e91c49c\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.865292 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81e92dde-6675-4a19-a619-52358e91c49c-config-data\") pod \"81e92dde-6675-4a19-a619-52358e91c49c\" (UID: \"81e92dde-6675-4a19-a619-52358e91c49c\") " Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.866373 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81e92dde-6675-4a19-a619-52358e91c49c-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "81e92dde-6675-4a19-a619-52358e91c49c" (UID: "81e92dde-6675-4a19-a619-52358e91c49c"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.866537 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81e92dde-6675-4a19-a619-52358e91c49c-config-data" (OuterVolumeSpecName: "config-data") pod "81e92dde-6675-4a19-a619-52358e91c49c" (UID: "81e92dde-6675-4a19-a619-52358e91c49c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.871334 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "81e92dde-6675-4a19-a619-52358e91c49c" (UID: "81e92dde-6675-4a19-a619-52358e91c49c"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.875790 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81e92dde-6675-4a19-a619-52358e91c49c-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "81e92dde-6675-4a19-a619-52358e91c49c" (UID: "81e92dde-6675-4a19-a619-52358e91c49c"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.877151 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e92dde-6675-4a19-a619-52358e91c49c-kube-api-access-bwgrh" (OuterVolumeSpecName: "kube-api-access-bwgrh") pod "81e92dde-6675-4a19-a619-52358e91c49c" (UID: "81e92dde-6675-4a19-a619-52358e91c49c"). InnerVolumeSpecName "kube-api-access-bwgrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.901672 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "81e92dde-6675-4a19-a619-52358e91c49c" (UID: "81e92dde-6675-4a19-a619-52358e91c49c"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.907977 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "81e92dde-6675-4a19-a619-52358e91c49c" (UID: "81e92dde-6675-4a19-a619-52358e91c49c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.908218 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "81e92dde-6675-4a19-a619-52358e91c49c" (UID: "81e92dde-6675-4a19-a619-52358e91c49c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.929356 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81e92dde-6675-4a19-a619-52358e91c49c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "81e92dde-6675-4a19-a619-52358e91c49c" (UID: "81e92dde-6675-4a19-a619-52358e91c49c"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.967792 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/81e92dde-6675-4a19-a619-52358e91c49c-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.967833 4775 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.967876 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.967907 4775 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/81e92dde-6675-4a19-a619-52358e91c49c-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.967921 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81e92dde-6675-4a19-a619-52358e91c49c-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.967934 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.967947 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81e92dde-6675-4a19-a619-52358e91c49c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.967957 
4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwgrh\" (UniqueName: \"kubernetes.io/projected/81e92dde-6675-4a19-a619-52358e91c49c-kube-api-access-bwgrh\") on node \"crc\" DevicePath \"\"" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.967968 4775 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/81e92dde-6675-4a19-a619-52358e91c49c-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 16 15:53:47 crc kubenswrapper[4775]: I1216 15:53:47.986346 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 16 15:53:48 crc kubenswrapper[4775]: I1216 15:53:48.068984 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 16 15:53:48 crc kubenswrapper[4775]: I1216 15:53:48.215489 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"81e92dde-6675-4a19-a619-52358e91c49c","Type":"ContainerDied","Data":"0a918477c1e177fb308656593047d223bbfa10d44c07e47d2b5c39f5bf151685"} Dec 16 15:53:48 crc kubenswrapper[4775]: I1216 15:53:48.215524 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a918477c1e177fb308656593047d223bbfa10d44c07e47d2b5c39f5bf151685" Dec 16 15:53:48 crc kubenswrapper[4775]: I1216 15:53:48.215587 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 16 15:53:51 crc kubenswrapper[4775]: I1216 15:53:51.507186 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 16 15:53:51 crc kubenswrapper[4775]: E1216 15:53:51.509043 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e92dde-6675-4a19-a619-52358e91c49c" containerName="tempest-tests-tempest-tests-runner" Dec 16 15:53:51 crc kubenswrapper[4775]: I1216 15:53:51.509066 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e92dde-6675-4a19-a619-52358e91c49c" containerName="tempest-tests-tempest-tests-runner" Dec 16 15:53:51 crc kubenswrapper[4775]: E1216 15:53:51.509086 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72db5a2d-07ca-4213-83b9-3d55eaa696b7" containerName="extract-utilities" Dec 16 15:53:51 crc kubenswrapper[4775]: I1216 15:53:51.509093 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="72db5a2d-07ca-4213-83b9-3d55eaa696b7" containerName="extract-utilities" Dec 16 15:53:51 crc kubenswrapper[4775]: E1216 15:53:51.509108 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72db5a2d-07ca-4213-83b9-3d55eaa696b7" containerName="registry-server" Dec 16 15:53:51 crc kubenswrapper[4775]: I1216 15:53:51.509115 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="72db5a2d-07ca-4213-83b9-3d55eaa696b7" containerName="registry-server" Dec 16 15:53:51 crc kubenswrapper[4775]: E1216 15:53:51.509134 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72db5a2d-07ca-4213-83b9-3d55eaa696b7" containerName="extract-content" Dec 16 15:53:51 crc kubenswrapper[4775]: I1216 15:53:51.509140 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="72db5a2d-07ca-4213-83b9-3d55eaa696b7" containerName="extract-content" Dec 16 15:53:51 crc kubenswrapper[4775]: I1216 15:53:51.509398 4775 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="72db5a2d-07ca-4213-83b9-3d55eaa696b7" containerName="registry-server" Dec 16 15:53:51 crc kubenswrapper[4775]: I1216 15:53:51.509418 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e92dde-6675-4a19-a619-52358e91c49c" containerName="tempest-tests-tempest-tests-runner" Dec 16 15:53:51 crc kubenswrapper[4775]: I1216 15:53:51.520999 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 16 15:53:51 crc kubenswrapper[4775]: I1216 15:53:51.521178 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:53:51 crc kubenswrapper[4775]: I1216 15:53:51.524062 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-q28vx" Dec 16 15:53:51 crc kubenswrapper[4775]: I1216 15:53:51.542094 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"974b82d4-0fe0-449c-89d3-619ac869f974\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:53:51 crc kubenswrapper[4775]: I1216 15:53:51.542185 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7tjt\" (UniqueName: \"kubernetes.io/projected/974b82d4-0fe0-449c-89d3-619ac869f974-kube-api-access-p7tjt\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"974b82d4-0fe0-449c-89d3-619ac869f974\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:53:51 crc kubenswrapper[4775]: I1216 15:53:51.644112 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"974b82d4-0fe0-449c-89d3-619ac869f974\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:53:51 crc kubenswrapper[4775]: I1216 15:53:51.644468 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7tjt\" (UniqueName: \"kubernetes.io/projected/974b82d4-0fe0-449c-89d3-619ac869f974-kube-api-access-p7tjt\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"974b82d4-0fe0-449c-89d3-619ac869f974\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:53:51 crc kubenswrapper[4775]: I1216 15:53:51.644619 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"974b82d4-0fe0-449c-89d3-619ac869f974\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:53:51 crc kubenswrapper[4775]: I1216 15:53:51.663076 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7tjt\" (UniqueName: \"kubernetes.io/projected/974b82d4-0fe0-449c-89d3-619ac869f974-kube-api-access-p7tjt\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"974b82d4-0fe0-449c-89d3-619ac869f974\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:53:51 crc kubenswrapper[4775]: I1216 15:53:51.667250 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"974b82d4-0fe0-449c-89d3-619ac869f974\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:53:51 
crc kubenswrapper[4775]: I1216 15:53:51.858927 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:53:52 crc kubenswrapper[4775]: I1216 15:53:52.330538 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 16 15:53:53 crc kubenswrapper[4775]: I1216 15:53:53.268275 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"974b82d4-0fe0-449c-89d3-619ac869f974","Type":"ContainerStarted","Data":"3fafce236cebcc25f660e5a2c5d90bb127c5f51ae4bdf626eb15d5c34159fa4a"} Dec 16 15:53:54 crc kubenswrapper[4775]: I1216 15:53:54.284560 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"974b82d4-0fe0-449c-89d3-619ac869f974","Type":"ContainerStarted","Data":"815f43a3617999e47467487a8e0367c2c9bee7cbcd46dcd989181b928de954f6"} Dec 16 15:53:54 crc kubenswrapper[4775]: I1216 15:53:54.313231 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.170151112 podStartE2EDuration="3.313212434s" podCreationTimestamp="2025-12-16 15:53:51 +0000 UTC" firstStartedPulling="2025-12-16 15:53:52.328753518 +0000 UTC m=+3557.279832471" lastFinishedPulling="2025-12-16 15:53:53.47181487 +0000 UTC m=+3558.422893793" observedRunningTime="2025-12-16 15:53:54.302325566 +0000 UTC m=+3559.253404509" watchObservedRunningTime="2025-12-16 15:53:54.313212434 +0000 UTC m=+3559.264291377" Dec 16 15:54:02 crc kubenswrapper[4775]: I1216 15:54:02.869234 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:54:02 crc kubenswrapper[4775]: I1216 15:54:02.869867 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:54:18 crc kubenswrapper[4775]: I1216 15:54:18.197444 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kqrn5/must-gather-q7z9n"] Dec 16 15:54:18 crc kubenswrapper[4775]: I1216 15:54:18.199687 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqrn5/must-gather-q7z9n" Dec 16 15:54:18 crc kubenswrapper[4775]: I1216 15:54:18.202698 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kqrn5"/"openshift-service-ca.crt" Dec 16 15:54:18 crc kubenswrapper[4775]: I1216 15:54:18.207675 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kqrn5"/"kube-root-ca.crt" Dec 16 15:54:18 crc kubenswrapper[4775]: I1216 15:54:18.208386 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kqrn5"/"default-dockercfg-p7vqw" Dec 16 15:54:18 crc kubenswrapper[4775]: I1216 15:54:18.230841 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqrn5/must-gather-q7z9n"] Dec 16 15:54:18 crc kubenswrapper[4775]: I1216 15:54:18.320668 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbsq8\" (UniqueName: \"kubernetes.io/projected/bf793d59-ff27-4cfb-b547-a15d08fc0367-kube-api-access-rbsq8\") pod \"must-gather-q7z9n\" (UID: \"bf793d59-ff27-4cfb-b547-a15d08fc0367\") " pod="openshift-must-gather-kqrn5/must-gather-q7z9n" Dec 16 15:54:18 crc 
kubenswrapper[4775]: I1216 15:54:18.320915 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bf793d59-ff27-4cfb-b547-a15d08fc0367-must-gather-output\") pod \"must-gather-q7z9n\" (UID: \"bf793d59-ff27-4cfb-b547-a15d08fc0367\") " pod="openshift-must-gather-kqrn5/must-gather-q7z9n" Dec 16 15:54:18 crc kubenswrapper[4775]: I1216 15:54:18.422932 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bf793d59-ff27-4cfb-b547-a15d08fc0367-must-gather-output\") pod \"must-gather-q7z9n\" (UID: \"bf793d59-ff27-4cfb-b547-a15d08fc0367\") " pod="openshift-must-gather-kqrn5/must-gather-q7z9n" Dec 16 15:54:18 crc kubenswrapper[4775]: I1216 15:54:18.423372 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbsq8\" (UniqueName: \"kubernetes.io/projected/bf793d59-ff27-4cfb-b547-a15d08fc0367-kube-api-access-rbsq8\") pod \"must-gather-q7z9n\" (UID: \"bf793d59-ff27-4cfb-b547-a15d08fc0367\") " pod="openshift-must-gather-kqrn5/must-gather-q7z9n" Dec 16 15:54:18 crc kubenswrapper[4775]: I1216 15:54:18.423416 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bf793d59-ff27-4cfb-b547-a15d08fc0367-must-gather-output\") pod \"must-gather-q7z9n\" (UID: \"bf793d59-ff27-4cfb-b547-a15d08fc0367\") " pod="openshift-must-gather-kqrn5/must-gather-q7z9n" Dec 16 15:54:18 crc kubenswrapper[4775]: I1216 15:54:18.440225 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbsq8\" (UniqueName: \"kubernetes.io/projected/bf793d59-ff27-4cfb-b547-a15d08fc0367-kube-api-access-rbsq8\") pod \"must-gather-q7z9n\" (UID: \"bf793d59-ff27-4cfb-b547-a15d08fc0367\") " pod="openshift-must-gather-kqrn5/must-gather-q7z9n" Dec 16 15:54:18 crc 
kubenswrapper[4775]: I1216 15:54:18.518327 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqrn5/must-gather-q7z9n" Dec 16 15:54:18 crc kubenswrapper[4775]: I1216 15:54:18.964148 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqrn5/must-gather-q7z9n"] Dec 16 15:54:18 crc kubenswrapper[4775]: W1216 15:54:18.965692 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf793d59_ff27_4cfb_b547_a15d08fc0367.slice/crio-bcffcff2795bd3150fed8f1f57475b78e552a8748a221e28c56a113924b5b581 WatchSource:0}: Error finding container bcffcff2795bd3150fed8f1f57475b78e552a8748a221e28c56a113924b5b581: Status 404 returned error can't find the container with id bcffcff2795bd3150fed8f1f57475b78e552a8748a221e28c56a113924b5b581 Dec 16 15:54:19 crc kubenswrapper[4775]: I1216 15:54:19.556811 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqrn5/must-gather-q7z9n" event={"ID":"bf793d59-ff27-4cfb-b547-a15d08fc0367","Type":"ContainerStarted","Data":"bcffcff2795bd3150fed8f1f57475b78e552a8748a221e28c56a113924b5b581"} Dec 16 15:54:25 crc kubenswrapper[4775]: I1216 15:54:25.607514 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqrn5/must-gather-q7z9n" event={"ID":"bf793d59-ff27-4cfb-b547-a15d08fc0367","Type":"ContainerStarted","Data":"8d49b382db5514949da76b0c0427c57250e74ef8bcc162314cbae4d103125c55"} Dec 16 15:54:26 crc kubenswrapper[4775]: I1216 15:54:26.635036 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqrn5/must-gather-q7z9n" event={"ID":"bf793d59-ff27-4cfb-b547-a15d08fc0367","Type":"ContainerStarted","Data":"4d5106ae799fba21c2d4dbc5de0a988fb75c578f4acac150921ad480e88537f0"} Dec 16 15:54:26 crc kubenswrapper[4775]: I1216 15:54:26.657406 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-kqrn5/must-gather-q7z9n" podStartSLOduration=2.488630077 podStartE2EDuration="8.657382806s" podCreationTimestamp="2025-12-16 15:54:18 +0000 UTC" firstStartedPulling="2025-12-16 15:54:18.968224525 +0000 UTC m=+3583.919303448" lastFinishedPulling="2025-12-16 15:54:25.136977214 +0000 UTC m=+3590.088056177" observedRunningTime="2025-12-16 15:54:26.649141079 +0000 UTC m=+3591.600220002" watchObservedRunningTime="2025-12-16 15:54:26.657382806 +0000 UTC m=+3591.608461739" Dec 16 15:54:29 crc kubenswrapper[4775]: I1216 15:54:29.081137 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kqrn5/crc-debug-mvwkg"] Dec 16 15:54:29 crc kubenswrapper[4775]: I1216 15:54:29.083130 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqrn5/crc-debug-mvwkg" Dec 16 15:54:29 crc kubenswrapper[4775]: I1216 15:54:29.155034 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1209be8a-e69d-4323-ab71-86d0285f34e4-host\") pod \"crc-debug-mvwkg\" (UID: \"1209be8a-e69d-4323-ab71-86d0285f34e4\") " pod="openshift-must-gather-kqrn5/crc-debug-mvwkg" Dec 16 15:54:29 crc kubenswrapper[4775]: I1216 15:54:29.155459 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjc9q\" (UniqueName: \"kubernetes.io/projected/1209be8a-e69d-4323-ab71-86d0285f34e4-kube-api-access-jjc9q\") pod \"crc-debug-mvwkg\" (UID: \"1209be8a-e69d-4323-ab71-86d0285f34e4\") " pod="openshift-must-gather-kqrn5/crc-debug-mvwkg" Dec 16 15:54:29 crc kubenswrapper[4775]: I1216 15:54:29.256727 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1209be8a-e69d-4323-ab71-86d0285f34e4-host\") pod \"crc-debug-mvwkg\" (UID: \"1209be8a-e69d-4323-ab71-86d0285f34e4\") " 
pod="openshift-must-gather-kqrn5/crc-debug-mvwkg" Dec 16 15:54:29 crc kubenswrapper[4775]: I1216 15:54:29.256813 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjc9q\" (UniqueName: \"kubernetes.io/projected/1209be8a-e69d-4323-ab71-86d0285f34e4-kube-api-access-jjc9q\") pod \"crc-debug-mvwkg\" (UID: \"1209be8a-e69d-4323-ab71-86d0285f34e4\") " pod="openshift-must-gather-kqrn5/crc-debug-mvwkg" Dec 16 15:54:29 crc kubenswrapper[4775]: I1216 15:54:29.257058 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1209be8a-e69d-4323-ab71-86d0285f34e4-host\") pod \"crc-debug-mvwkg\" (UID: \"1209be8a-e69d-4323-ab71-86d0285f34e4\") " pod="openshift-must-gather-kqrn5/crc-debug-mvwkg" Dec 16 15:54:29 crc kubenswrapper[4775]: I1216 15:54:29.296052 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjc9q\" (UniqueName: \"kubernetes.io/projected/1209be8a-e69d-4323-ab71-86d0285f34e4-kube-api-access-jjc9q\") pod \"crc-debug-mvwkg\" (UID: \"1209be8a-e69d-4323-ab71-86d0285f34e4\") " pod="openshift-must-gather-kqrn5/crc-debug-mvwkg" Dec 16 15:54:29 crc kubenswrapper[4775]: I1216 15:54:29.400902 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqrn5/crc-debug-mvwkg" Dec 16 15:54:29 crc kubenswrapper[4775]: W1216 15:54:29.457622 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1209be8a_e69d_4323_ab71_86d0285f34e4.slice/crio-018bf573bbf86114954e709c9baf1e85e63c49c38000617899f80d2543837139 WatchSource:0}: Error finding container 018bf573bbf86114954e709c9baf1e85e63c49c38000617899f80d2543837139: Status 404 returned error can't find the container with id 018bf573bbf86114954e709c9baf1e85e63c49c38000617899f80d2543837139 Dec 16 15:54:29 crc kubenswrapper[4775]: I1216 15:54:29.665382 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqrn5/crc-debug-mvwkg" event={"ID":"1209be8a-e69d-4323-ab71-86d0285f34e4","Type":"ContainerStarted","Data":"018bf573bbf86114954e709c9baf1e85e63c49c38000617899f80d2543837139"} Dec 16 15:54:32 crc kubenswrapper[4775]: I1216 15:54:32.869148 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:54:32 crc kubenswrapper[4775]: I1216 15:54:32.869698 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:54:41 crc kubenswrapper[4775]: I1216 15:54:41.769179 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqrn5/crc-debug-mvwkg" 
event={"ID":"1209be8a-e69d-4323-ab71-86d0285f34e4","Type":"ContainerStarted","Data":"2692c3d7920252eec3388e2da208e081057391885af8bff644c39f8a88abac7a"} Dec 16 15:54:41 crc kubenswrapper[4775]: I1216 15:54:41.796266 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kqrn5/crc-debug-mvwkg" podStartSLOduration=0.984365898 podStartE2EDuration="12.796243602s" podCreationTimestamp="2025-12-16 15:54:29 +0000 UTC" firstStartedPulling="2025-12-16 15:54:29.460355063 +0000 UTC m=+3594.411433986" lastFinishedPulling="2025-12-16 15:54:41.272232767 +0000 UTC m=+3606.223311690" observedRunningTime="2025-12-16 15:54:41.790181912 +0000 UTC m=+3606.741260845" watchObservedRunningTime="2025-12-16 15:54:41.796243602 +0000 UTC m=+3606.747322525" Dec 16 15:55:02 crc kubenswrapper[4775]: I1216 15:55:02.869390 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:55:02 crc kubenswrapper[4775]: I1216 15:55:02.869856 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:55:02 crc kubenswrapper[4775]: I1216 15:55:02.869934 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 15:55:02 crc kubenswrapper[4775]: I1216 15:55:02.870675 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"e1c8d61f3889d5bd1528924be8d5a0555fad8c946aa20ed5020d09539894e766"} pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:55:02 crc kubenswrapper[4775]: I1216 15:55:02.870727 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" containerID="cri-o://e1c8d61f3889d5bd1528924be8d5a0555fad8c946aa20ed5020d09539894e766" gracePeriod=600 Dec 16 15:55:03 crc kubenswrapper[4775]: I1216 15:55:03.964259 4775 generic.go:334] "Generic (PLEG): container finished" podID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerID="e1c8d61f3889d5bd1528924be8d5a0555fad8c946aa20ed5020d09539894e766" exitCode=0 Dec 16 15:55:03 crc kubenswrapper[4775]: I1216 15:55:03.964330 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerDied","Data":"e1c8d61f3889d5bd1528924be8d5a0555fad8c946aa20ed5020d09539894e766"} Dec 16 15:55:03 crc kubenswrapper[4775]: I1216 15:55:03.964853 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf"} Dec 16 15:55:03 crc kubenswrapper[4775]: I1216 15:55:03.964910 4775 scope.go:117] "RemoveContainer" containerID="7ae1b1f0394efcd9a5ea10c77c3a3aa03815f0ba5c9427502c51f94e757b3ef6" Dec 16 15:55:25 crc kubenswrapper[4775]: I1216 15:55:25.249424 4775 generic.go:334] "Generic (PLEG): container finished" podID="1209be8a-e69d-4323-ab71-86d0285f34e4" 
containerID="2692c3d7920252eec3388e2da208e081057391885af8bff644c39f8a88abac7a" exitCode=0 Dec 16 15:55:25 crc kubenswrapper[4775]: I1216 15:55:25.249527 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqrn5/crc-debug-mvwkg" event={"ID":"1209be8a-e69d-4323-ab71-86d0285f34e4","Type":"ContainerDied","Data":"2692c3d7920252eec3388e2da208e081057391885af8bff644c39f8a88abac7a"} Dec 16 15:55:26 crc kubenswrapper[4775]: I1216 15:55:26.376036 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqrn5/crc-debug-mvwkg" Dec 16 15:55:26 crc kubenswrapper[4775]: I1216 15:55:26.397906 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjc9q\" (UniqueName: \"kubernetes.io/projected/1209be8a-e69d-4323-ab71-86d0285f34e4-kube-api-access-jjc9q\") pod \"1209be8a-e69d-4323-ab71-86d0285f34e4\" (UID: \"1209be8a-e69d-4323-ab71-86d0285f34e4\") " Dec 16 15:55:26 crc kubenswrapper[4775]: I1216 15:55:26.397977 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1209be8a-e69d-4323-ab71-86d0285f34e4-host\") pod \"1209be8a-e69d-4323-ab71-86d0285f34e4\" (UID: \"1209be8a-e69d-4323-ab71-86d0285f34e4\") " Dec 16 15:55:26 crc kubenswrapper[4775]: I1216 15:55:26.399072 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1209be8a-e69d-4323-ab71-86d0285f34e4-host" (OuterVolumeSpecName: "host") pod "1209be8a-e69d-4323-ab71-86d0285f34e4" (UID: "1209be8a-e69d-4323-ab71-86d0285f34e4"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:55:26 crc kubenswrapper[4775]: I1216 15:55:26.405513 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1209be8a-e69d-4323-ab71-86d0285f34e4-kube-api-access-jjc9q" (OuterVolumeSpecName: "kube-api-access-jjc9q") pod "1209be8a-e69d-4323-ab71-86d0285f34e4" (UID: "1209be8a-e69d-4323-ab71-86d0285f34e4"). InnerVolumeSpecName "kube-api-access-jjc9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:55:26 crc kubenswrapper[4775]: I1216 15:55:26.417928 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kqrn5/crc-debug-mvwkg"] Dec 16 15:55:26 crc kubenswrapper[4775]: I1216 15:55:26.429757 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kqrn5/crc-debug-mvwkg"] Dec 16 15:55:26 crc kubenswrapper[4775]: I1216 15:55:26.500577 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjc9q\" (UniqueName: \"kubernetes.io/projected/1209be8a-e69d-4323-ab71-86d0285f34e4-kube-api-access-jjc9q\") on node \"crc\" DevicePath \"\"" Dec 16 15:55:26 crc kubenswrapper[4775]: I1216 15:55:26.500611 4775 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1209be8a-e69d-4323-ab71-86d0285f34e4-host\") on node \"crc\" DevicePath \"\"" Dec 16 15:55:27 crc kubenswrapper[4775]: I1216 15:55:27.270617 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="018bf573bbf86114954e709c9baf1e85e63c49c38000617899f80d2543837139" Dec 16 15:55:27 crc kubenswrapper[4775]: I1216 15:55:27.270731 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqrn5/crc-debug-mvwkg" Dec 16 15:55:27 crc kubenswrapper[4775]: I1216 15:55:27.359337 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1209be8a-e69d-4323-ab71-86d0285f34e4" path="/var/lib/kubelet/pods/1209be8a-e69d-4323-ab71-86d0285f34e4/volumes" Dec 16 15:55:27 crc kubenswrapper[4775]: I1216 15:55:27.649723 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kqrn5/crc-debug-m5pv8"] Dec 16 15:55:27 crc kubenswrapper[4775]: E1216 15:55:27.650194 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1209be8a-e69d-4323-ab71-86d0285f34e4" containerName="container-00" Dec 16 15:55:27 crc kubenswrapper[4775]: I1216 15:55:27.650208 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1209be8a-e69d-4323-ab71-86d0285f34e4" containerName="container-00" Dec 16 15:55:27 crc kubenswrapper[4775]: I1216 15:55:27.650502 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1209be8a-e69d-4323-ab71-86d0285f34e4" containerName="container-00" Dec 16 15:55:27 crc kubenswrapper[4775]: I1216 15:55:27.651442 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqrn5/crc-debug-m5pv8" Dec 16 15:55:27 crc kubenswrapper[4775]: I1216 15:55:27.724827 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhz46\" (UniqueName: \"kubernetes.io/projected/57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7-kube-api-access-dhz46\") pod \"crc-debug-m5pv8\" (UID: \"57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7\") " pod="openshift-must-gather-kqrn5/crc-debug-m5pv8" Dec 16 15:55:27 crc kubenswrapper[4775]: I1216 15:55:27.724916 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7-host\") pod \"crc-debug-m5pv8\" (UID: \"57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7\") " pod="openshift-must-gather-kqrn5/crc-debug-m5pv8" Dec 16 15:55:27 crc kubenswrapper[4775]: I1216 15:55:27.826698 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhz46\" (UniqueName: \"kubernetes.io/projected/57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7-kube-api-access-dhz46\") pod \"crc-debug-m5pv8\" (UID: \"57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7\") " pod="openshift-must-gather-kqrn5/crc-debug-m5pv8" Dec 16 15:55:27 crc kubenswrapper[4775]: I1216 15:55:27.826830 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7-host\") pod \"crc-debug-m5pv8\" (UID: \"57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7\") " pod="openshift-must-gather-kqrn5/crc-debug-m5pv8" Dec 16 15:55:27 crc kubenswrapper[4775]: I1216 15:55:27.827043 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7-host\") pod \"crc-debug-m5pv8\" (UID: \"57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7\") " pod="openshift-must-gather-kqrn5/crc-debug-m5pv8" Dec 16 15:55:27 crc 
kubenswrapper[4775]: I1216 15:55:27.852022 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhz46\" (UniqueName: \"kubernetes.io/projected/57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7-kube-api-access-dhz46\") pod \"crc-debug-m5pv8\" (UID: \"57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7\") " pod="openshift-must-gather-kqrn5/crc-debug-m5pv8" Dec 16 15:55:27 crc kubenswrapper[4775]: I1216 15:55:27.968004 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqrn5/crc-debug-m5pv8" Dec 16 15:55:28 crc kubenswrapper[4775]: I1216 15:55:28.285143 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqrn5/crc-debug-m5pv8" event={"ID":"57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7","Type":"ContainerStarted","Data":"88a013bf84a782a1ce3257866b7eeb7dc2467d78c9ae77f3aea48e98e49ecdf8"} Dec 16 15:55:28 crc kubenswrapper[4775]: I1216 15:55:28.285522 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqrn5/crc-debug-m5pv8" event={"ID":"57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7","Type":"ContainerStarted","Data":"50d54769834a49ac8b44a8d264110dc97db0a0f40e2677466f8fb21e32a41def"} Dec 16 15:55:28 crc kubenswrapper[4775]: I1216 15:55:28.311551 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kqrn5/crc-debug-m5pv8" podStartSLOduration=1.311525129 podStartE2EDuration="1.311525129s" podCreationTimestamp="2025-12-16 15:55:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:55:28.304585142 +0000 UTC m=+3653.255664065" watchObservedRunningTime="2025-12-16 15:55:28.311525129 +0000 UTC m=+3653.262604052" Dec 16 15:55:29 crc kubenswrapper[4775]: I1216 15:55:29.293952 4775 generic.go:334] "Generic (PLEG): container finished" podID="57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7" 
containerID="88a013bf84a782a1ce3257866b7eeb7dc2467d78c9ae77f3aea48e98e49ecdf8" exitCode=0 Dec 16 15:55:29 crc kubenswrapper[4775]: I1216 15:55:29.294026 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqrn5/crc-debug-m5pv8" event={"ID":"57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7","Type":"ContainerDied","Data":"88a013bf84a782a1ce3257866b7eeb7dc2467d78c9ae77f3aea48e98e49ecdf8"} Dec 16 15:55:30 crc kubenswrapper[4775]: I1216 15:55:30.432339 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqrn5/crc-debug-m5pv8" Dec 16 15:55:30 crc kubenswrapper[4775]: I1216 15:55:30.465599 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kqrn5/crc-debug-m5pv8"] Dec 16 15:55:30 crc kubenswrapper[4775]: I1216 15:55:30.475269 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kqrn5/crc-debug-m5pv8"] Dec 16 15:55:30 crc kubenswrapper[4775]: I1216 15:55:30.575840 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhz46\" (UniqueName: \"kubernetes.io/projected/57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7-kube-api-access-dhz46\") pod \"57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7\" (UID: \"57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7\") " Dec 16 15:55:30 crc kubenswrapper[4775]: I1216 15:55:30.576097 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7-host\") pod \"57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7\" (UID: \"57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7\") " Dec 16 15:55:30 crc kubenswrapper[4775]: I1216 15:55:30.576259 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7-host" (OuterVolumeSpecName: "host") pod "57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7" (UID: "57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:55:30 crc kubenswrapper[4775]: I1216 15:55:30.576548 4775 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7-host\") on node \"crc\" DevicePath \"\"" Dec 16 15:55:30 crc kubenswrapper[4775]: I1216 15:55:30.581740 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7-kube-api-access-dhz46" (OuterVolumeSpecName: "kube-api-access-dhz46") pod "57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7" (UID: "57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7"). InnerVolumeSpecName "kube-api-access-dhz46". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:55:30 crc kubenswrapper[4775]: I1216 15:55:30.678732 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhz46\" (UniqueName: \"kubernetes.io/projected/57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7-kube-api-access-dhz46\") on node \"crc\" DevicePath \"\"" Dec 16 15:55:31 crc kubenswrapper[4775]: I1216 15:55:31.311923 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50d54769834a49ac8b44a8d264110dc97db0a0f40e2677466f8fb21e32a41def" Dec 16 15:55:31 crc kubenswrapper[4775]: I1216 15:55:31.312069 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqrn5/crc-debug-m5pv8" Dec 16 15:55:31 crc kubenswrapper[4775]: I1216 15:55:31.370023 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7" path="/var/lib/kubelet/pods/57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7/volumes" Dec 16 15:55:31 crc kubenswrapper[4775]: I1216 15:55:31.657725 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kqrn5/crc-debug-r7l6g"] Dec 16 15:55:31 crc kubenswrapper[4775]: E1216 15:55:31.658148 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7" containerName="container-00" Dec 16 15:55:31 crc kubenswrapper[4775]: I1216 15:55:31.658162 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7" containerName="container-00" Dec 16 15:55:31 crc kubenswrapper[4775]: I1216 15:55:31.658356 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e3a0a6-34a9-4dbd-8c07-f561ac3f21e7" containerName="container-00" Dec 16 15:55:31 crc kubenswrapper[4775]: I1216 15:55:31.658984 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqrn5/crc-debug-r7l6g" Dec 16 15:55:31 crc kubenswrapper[4775]: I1216 15:55:31.698079 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2-host\") pod \"crc-debug-r7l6g\" (UID: \"c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2\") " pod="openshift-must-gather-kqrn5/crc-debug-r7l6g" Dec 16 15:55:31 crc kubenswrapper[4775]: I1216 15:55:31.698384 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9wdz\" (UniqueName: \"kubernetes.io/projected/c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2-kube-api-access-g9wdz\") pod \"crc-debug-r7l6g\" (UID: \"c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2\") " pod="openshift-must-gather-kqrn5/crc-debug-r7l6g" Dec 16 15:55:31 crc kubenswrapper[4775]: I1216 15:55:31.800370 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2-host\") pod \"crc-debug-r7l6g\" (UID: \"c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2\") " pod="openshift-must-gather-kqrn5/crc-debug-r7l6g" Dec 16 15:55:31 crc kubenswrapper[4775]: I1216 15:55:31.800473 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9wdz\" (UniqueName: \"kubernetes.io/projected/c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2-kube-api-access-g9wdz\") pod \"crc-debug-r7l6g\" (UID: \"c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2\") " pod="openshift-must-gather-kqrn5/crc-debug-r7l6g" Dec 16 15:55:31 crc kubenswrapper[4775]: I1216 15:55:31.800470 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2-host\") pod \"crc-debug-r7l6g\" (UID: \"c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2\") " pod="openshift-must-gather-kqrn5/crc-debug-r7l6g" Dec 16 15:55:31 crc 
kubenswrapper[4775]: I1216 15:55:31.819454 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9wdz\" (UniqueName: \"kubernetes.io/projected/c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2-kube-api-access-g9wdz\") pod \"crc-debug-r7l6g\" (UID: \"c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2\") " pod="openshift-must-gather-kqrn5/crc-debug-r7l6g" Dec 16 15:55:31 crc kubenswrapper[4775]: I1216 15:55:31.973363 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqrn5/crc-debug-r7l6g" Dec 16 15:55:32 crc kubenswrapper[4775]: W1216 15:55:32.008756 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc29df4d2_4bcd_4d8f_9ed1_b732ac615cd2.slice/crio-09e9e54628d4fab96ce247f8ee47d92070b353825e69b54db1d70329429b077d WatchSource:0}: Error finding container 09e9e54628d4fab96ce247f8ee47d92070b353825e69b54db1d70329429b077d: Status 404 returned error can't find the container with id 09e9e54628d4fab96ce247f8ee47d92070b353825e69b54db1d70329429b077d Dec 16 15:55:32 crc kubenswrapper[4775]: I1216 15:55:32.322434 4775 generic.go:334] "Generic (PLEG): container finished" podID="c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2" containerID="4ed93c71818ef43a2f76376cd8da8b653d583725add8e3922a387e83aa39a13a" exitCode=0 Dec 16 15:55:32 crc kubenswrapper[4775]: I1216 15:55:32.322553 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqrn5/crc-debug-r7l6g" event={"ID":"c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2","Type":"ContainerDied","Data":"4ed93c71818ef43a2f76376cd8da8b653d583725add8e3922a387e83aa39a13a"} Dec 16 15:55:32 crc kubenswrapper[4775]: I1216 15:55:32.323509 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqrn5/crc-debug-r7l6g" event={"ID":"c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2","Type":"ContainerStarted","Data":"09e9e54628d4fab96ce247f8ee47d92070b353825e69b54db1d70329429b077d"} Dec 16 
15:55:32 crc kubenswrapper[4775]: I1216 15:55:32.361178 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kqrn5/crc-debug-r7l6g"] Dec 16 15:55:32 crc kubenswrapper[4775]: I1216 15:55:32.369801 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kqrn5/crc-debug-r7l6g"] Dec 16 15:55:33 crc kubenswrapper[4775]: I1216 15:55:33.448938 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqrn5/crc-debug-r7l6g" Dec 16 15:55:33 crc kubenswrapper[4775]: I1216 15:55:33.529824 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2-host\") pod \"c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2\" (UID: \"c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2\") " Dec 16 15:55:33 crc kubenswrapper[4775]: I1216 15:55:33.529906 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9wdz\" (UniqueName: \"kubernetes.io/projected/c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2-kube-api-access-g9wdz\") pod \"c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2\" (UID: \"c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2\") " Dec 16 15:55:33 crc kubenswrapper[4775]: I1216 15:55:33.529927 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2-host" (OuterVolumeSpecName: "host") pod "c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2" (UID: "c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:55:33 crc kubenswrapper[4775]: I1216 15:55:33.530173 4775 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2-host\") on node \"crc\" DevicePath \"\"" Dec 16 15:55:33 crc kubenswrapper[4775]: I1216 15:55:33.535597 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2-kube-api-access-g9wdz" (OuterVolumeSpecName: "kube-api-access-g9wdz") pod "c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2" (UID: "c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2"). InnerVolumeSpecName "kube-api-access-g9wdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:55:33 crc kubenswrapper[4775]: I1216 15:55:33.631798 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9wdz\" (UniqueName: \"kubernetes.io/projected/c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2-kube-api-access-g9wdz\") on node \"crc\" DevicePath \"\"" Dec 16 15:55:34 crc kubenswrapper[4775]: I1216 15:55:34.342945 4775 scope.go:117] "RemoveContainer" containerID="4ed93c71818ef43a2f76376cd8da8b653d583725add8e3922a387e83aa39a13a" Dec 16 15:55:34 crc kubenswrapper[4775]: I1216 15:55:34.343240 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqrn5/crc-debug-r7l6g" Dec 16 15:55:35 crc kubenswrapper[4775]: I1216 15:55:35.350435 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2" path="/var/lib/kubelet/pods/c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2/volumes" Dec 16 15:55:48 crc kubenswrapper[4775]: I1216 15:55:48.037999 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5cdf7cd694-pv7bs_38f1f660-5367-4db0-a653-c72807682175/barbican-api/0.log" Dec 16 15:55:48 crc kubenswrapper[4775]: I1216 15:55:48.189110 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5cdf7cd694-pv7bs_38f1f660-5367-4db0-a653-c72807682175/barbican-api-log/0.log" Dec 16 15:55:48 crc kubenswrapper[4775]: I1216 15:55:48.250299 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6c8466bf58-6vkrk_ee4d6d93-229d-499b-8121-123db79d7758/barbican-keystone-listener/0.log" Dec 16 15:55:48 crc kubenswrapper[4775]: I1216 15:55:48.406273 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6c8466bf58-6vkrk_ee4d6d93-229d-499b-8121-123db79d7758/barbican-keystone-listener-log/0.log" Dec 16 15:55:48 crc kubenswrapper[4775]: I1216 15:55:48.475482 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d485fb59c-wh26t_1035b49f-bf1a-44ee-9cd4-01df93145086/barbican-worker/0.log" Dec 16 15:55:48 crc kubenswrapper[4775]: I1216 15:55:48.488350 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d485fb59c-wh26t_1035b49f-bf1a-44ee-9cd4-01df93145086/barbican-worker-log/0.log" Dec 16 15:55:48 crc kubenswrapper[4775]: I1216 15:55:48.679447 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f_2b2d1ae7-ec42-4c6c-9400-966f2093d883/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:55:48 crc kubenswrapper[4775]: I1216 15:55:48.742067 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81d44d7a-71b0-40da-b940-ccdb6d63b4f9/ceilometer-central-agent/0.log" Dec 16 15:55:48 crc kubenswrapper[4775]: I1216 15:55:48.858442 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81d44d7a-71b0-40da-b940-ccdb6d63b4f9/ceilometer-notification-agent/0.log" Dec 16 15:55:48 crc kubenswrapper[4775]: I1216 15:55:48.868052 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81d44d7a-71b0-40da-b940-ccdb6d63b4f9/proxy-httpd/0.log" Dec 16 15:55:48 crc kubenswrapper[4775]: I1216 15:55:48.899782 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81d44d7a-71b0-40da-b940-ccdb6d63b4f9/sg-core/0.log" Dec 16 15:55:49 crc kubenswrapper[4775]: I1216 15:55:49.066083 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_521e4d08-bfb5-4043-bc0f-7515dbeb467f/cinder-api-log/0.log" Dec 16 15:55:49 crc kubenswrapper[4775]: I1216 15:55:49.112581 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_521e4d08-bfb5-4043-bc0f-7515dbeb467f/cinder-api/0.log" Dec 16 15:55:49 crc kubenswrapper[4775]: I1216 15:55:49.191406 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fc19d8c3-9cbc-45db-ad19-ab8a38792218/cinder-scheduler/0.log" Dec 16 15:55:49 crc kubenswrapper[4775]: I1216 15:55:49.313617 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fc19d8c3-9cbc-45db-ad19-ab8a38792218/probe/0.log" Dec 16 15:55:49 crc kubenswrapper[4775]: I1216 15:55:49.383566 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-76qtc_14cad095-639f-4735-8e83-d5a2abd771c3/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:55:49 crc kubenswrapper[4775]: I1216 15:55:49.533314 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7_410c8945-6eac-4dd6-943b-a2024de59d58/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:55:49 crc kubenswrapper[4775]: I1216 15:55:49.614152 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-f8mml_cb5b019a-c088-4515-91e1-a110d1ee04c9/init/0.log" Dec 16 15:55:49 crc kubenswrapper[4775]: I1216 15:55:49.762120 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-f8mml_cb5b019a-c088-4515-91e1-a110d1ee04c9/init/0.log" Dec 16 15:55:49 crc kubenswrapper[4775]: I1216 15:55:49.766247 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-f8mml_cb5b019a-c088-4515-91e1-a110d1ee04c9/dnsmasq-dns/0.log" Dec 16 15:55:49 crc kubenswrapper[4775]: I1216 15:55:49.849332 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp_32966f09-4e16-4fcb-925e-edb1c957cea1/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:55:49 crc kubenswrapper[4775]: I1216 15:55:49.995423 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7732fee4-0518-41db-be31-b9c7ae4aca6b/glance-httpd/0.log" Dec 16 15:55:50 crc kubenswrapper[4775]: I1216 15:55:50.063639 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7732fee4-0518-41db-be31-b9c7ae4aca6b/glance-log/0.log" Dec 16 15:55:50 crc kubenswrapper[4775]: I1216 15:55:50.187239 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_0e61daa0-8bea-4632-8936-5fb68d555ab1/glance-httpd/0.log" Dec 16 15:55:50 crc kubenswrapper[4775]: I1216 15:55:50.235932 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0e61daa0-8bea-4632-8936-5fb68d555ab1/glance-log/0.log" Dec 16 15:55:50 crc kubenswrapper[4775]: I1216 15:55:50.804800 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-59fcc7f56d-krpcl_262d5cc2-3677-4f62-aa93-60ccab4cf899/heat-engine/0.log" Dec 16 15:55:50 crc kubenswrapper[4775]: I1216 15:55:50.832282 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-69676fb7c9-tmm27_d9a8d05d-1353-46db-9367-c7205a7d39d9/heat-cfnapi/0.log" Dec 16 15:55:50 crc kubenswrapper[4775]: I1216 15:55:50.863700 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-55844f6789-qwjbq_ea336475-3963-43eb-9e16-814d0c717625/heat-api/0.log" Dec 16 15:55:50 crc kubenswrapper[4775]: I1216 15:55:50.986071 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-mxktb_5f07dc8f-f161-4826-b191-4344f1b741e0/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:55:51 crc kubenswrapper[4775]: I1216 15:55:51.038825 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-lv5v6_c99310e2-070c-4bed-b14d-850dfd069353/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:55:51 crc kubenswrapper[4775]: I1216 15:55:51.312607 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_44120d84-ab08-40cb-ad82-59518b6f55b2/kube-state-metrics/0.log" Dec 16 15:55:51 crc kubenswrapper[4775]: I1216 15:55:51.550417 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9_0570786e-5fec-43cf-b7ec-12a4facea06d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:55:51 crc kubenswrapper[4775]: I1216 15:55:51.565371 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-779ff79b57-nb7bt_2e88e1b8-8837-49a6-9769-ddab7adfb812/keystone-api/0.log" Dec 16 15:55:51 crc kubenswrapper[4775]: I1216 15:55:51.976230 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-69cbb5df9f-wmvhj_66ed3d02-9c61-43a8-90bf-35d00458d088/neutron-httpd/0.log" Dec 16 15:55:52 crc kubenswrapper[4775]: I1216 15:55:52.002488 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-69cbb5df9f-wmvhj_66ed3d02-9c61-43a8-90bf-35d00458d088/neutron-api/0.log" Dec 16 15:55:52 crc kubenswrapper[4775]: I1216 15:55:52.050416 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh_9a992ef8-ad46-4e3a-a98a-dc75ad484c7e/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:55:52 crc kubenswrapper[4775]: I1216 15:55:52.577742 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_22a00983-b0df-4afb-bbc2-2f7da7c8c05e/nova-api-log/0.log" Dec 16 15:55:52 crc kubenswrapper[4775]: I1216 15:55:52.668768 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_af50c3ce-5c89-46eb-bd8c-83346b17ad3d/nova-cell0-conductor-conductor/0.log" Dec 16 15:55:52 crc kubenswrapper[4775]: I1216 15:55:52.679112 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_22a00983-b0df-4afb-bbc2-2f7da7c8c05e/nova-api-api/0.log" Dec 16 15:55:52 crc kubenswrapper[4775]: I1216 15:55:52.813102 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_b19dccbe-2434-48ae-8822-1ced3b7167c7/nova-cell1-conductor-conductor/0.log" 
Dec 16 15:55:52 crc kubenswrapper[4775]: I1216 15:55:52.970045 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2451285e-33a6-42ca-b8f9-336131211c7b/nova-cell1-novncproxy-novncproxy/0.log" Dec 16 15:55:53 crc kubenswrapper[4775]: I1216 15:55:53.082988 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-nnz9f_e3ac9c58-f9b2-4b76-baec-dc50c94c8185/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:55:53 crc kubenswrapper[4775]: I1216 15:55:53.206808 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9e94b8ac-6213-42f6-94ff-7e42e358fcf9/nova-metadata-log/0.log" Dec 16 15:55:53 crc kubenswrapper[4775]: I1216 15:55:53.483970 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f991d67b-2c42-4f93-aacb-3486ea1e43a8/nova-scheduler-scheduler/0.log" Dec 16 15:55:53 crc kubenswrapper[4775]: I1216 15:55:53.546284 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e109503b-1619-4659-956c-24c58c0011a6/mysql-bootstrap/0.log" Dec 16 15:55:53 crc kubenswrapper[4775]: I1216 15:55:53.787638 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e109503b-1619-4659-956c-24c58c0011a6/mysql-bootstrap/0.log" Dec 16 15:55:53 crc kubenswrapper[4775]: I1216 15:55:53.800914 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e109503b-1619-4659-956c-24c58c0011a6/galera/0.log" Dec 16 15:55:53 crc kubenswrapper[4775]: I1216 15:55:53.965494 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2baed48f-c5f4-4126-b0ed-403a38b18c00/mysql-bootstrap/0.log" Dec 16 15:55:54 crc kubenswrapper[4775]: I1216 15:55:54.222983 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_2baed48f-c5f4-4126-b0ed-403a38b18c00/mysql-bootstrap/0.log" Dec 16 15:55:54 crc kubenswrapper[4775]: I1216 15:55:54.239758 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2baed48f-c5f4-4126-b0ed-403a38b18c00/galera/0.log" Dec 16 15:55:54 crc kubenswrapper[4775]: I1216 15:55:54.364775 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a9f81b8a-3b7e-4984-946f-2de17873b97a/openstackclient/0.log" Dec 16 15:55:54 crc kubenswrapper[4775]: I1216 15:55:54.376263 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9e94b8ac-6213-42f6-94ff-7e42e358fcf9/nova-metadata-metadata/0.log" Dec 16 15:55:54 crc kubenswrapper[4775]: I1216 15:55:54.510743 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nwdtm_096c5279-0aa8-4641-8b5f-66e41869ec98/openstack-network-exporter/0.log" Dec 16 15:55:54 crc kubenswrapper[4775]: I1216 15:55:54.634695 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-c5f9m_a42d9c48-0f56-4f2d-8c54-8baebeca09ea/ovsdb-server-init/0.log" Dec 16 15:55:54 crc kubenswrapper[4775]: I1216 15:55:54.864240 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-c5f9m_a42d9c48-0f56-4f2d-8c54-8baebeca09ea/ovsdb-server-init/0.log" Dec 16 15:55:54 crc kubenswrapper[4775]: I1216 15:55:54.886261 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-c5f9m_a42d9c48-0f56-4f2d-8c54-8baebeca09ea/ovs-vswitchd/0.log" Dec 16 15:55:54 crc kubenswrapper[4775]: I1216 15:55:54.932019 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-c5f9m_a42d9c48-0f56-4f2d-8c54-8baebeca09ea/ovsdb-server/0.log" Dec 16 15:55:55 crc kubenswrapper[4775]: I1216 15:55:55.077704 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-rkmmt_b560f177-aa8d-4722-92bd-4ef2755caab0/ovn-controller/0.log" Dec 16 15:55:55 crc kubenswrapper[4775]: I1216 15:55:55.148580 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fdz4b_0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:55:55 crc kubenswrapper[4775]: I1216 15:55:55.323590 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_470f973b-96da-437e-a5ce-e53dbadd9276/ovn-northd/0.log" Dec 16 15:55:55 crc kubenswrapper[4775]: I1216 15:55:55.372836 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_470f973b-96da-437e-a5ce-e53dbadd9276/openstack-network-exporter/0.log" Dec 16 15:55:55 crc kubenswrapper[4775]: I1216 15:55:55.509792 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8b7b212f-2aa6-4fc0-a864-6cd8f1943b71/openstack-network-exporter/0.log" Dec 16 15:55:55 crc kubenswrapper[4775]: I1216 15:55:55.540402 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8b7b212f-2aa6-4fc0-a864-6cd8f1943b71/ovsdbserver-nb/0.log" Dec 16 15:55:55 crc kubenswrapper[4775]: I1216 15:55:55.668955 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e5650e0a-bb07-4cce-872c-772038c2ae56/openstack-network-exporter/0.log" Dec 16 15:55:55 crc kubenswrapper[4775]: I1216 15:55:55.743744 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e5650e0a-bb07-4cce-872c-772038c2ae56/ovsdbserver-sb/0.log" Dec 16 15:55:55 crc kubenswrapper[4775]: I1216 15:55:55.868555 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-c66959c54-2xm6x_c1a21b6b-9081-4060-8bc8-566c2a60bde6/placement-api/0.log" Dec 16 15:55:55 crc kubenswrapper[4775]: I1216 15:55:55.960513 4775 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-c66959c54-2xm6x_c1a21b6b-9081-4060-8bc8-566c2a60bde6/placement-log/0.log" Dec 16 15:55:56 crc kubenswrapper[4775]: I1216 15:55:56.041445 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cf004dca-5d2e-4e4d-9c29-66b076fcc406/setup-container/0.log" Dec 16 15:55:56 crc kubenswrapper[4775]: I1216 15:55:56.266381 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cf004dca-5d2e-4e4d-9c29-66b076fcc406/setup-container/0.log" Dec 16 15:55:56 crc kubenswrapper[4775]: I1216 15:55:56.266756 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ba99f865-7192-4da9-8575-62d54a66d82e/setup-container/0.log" Dec 16 15:55:56 crc kubenswrapper[4775]: I1216 15:55:56.282004 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cf004dca-5d2e-4e4d-9c29-66b076fcc406/rabbitmq/0.log" Dec 16 15:55:56 crc kubenswrapper[4775]: I1216 15:55:56.477589 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ba99f865-7192-4da9-8575-62d54a66d82e/setup-container/0.log" Dec 16 15:55:56 crc kubenswrapper[4775]: I1216 15:55:56.535646 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ba99f865-7192-4da9-8575-62d54a66d82e/rabbitmq/0.log" Dec 16 15:55:56 crc kubenswrapper[4775]: I1216 15:55:56.604216 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f_dab0db60-d31f-4e9d-b17e-5dea1fdc90cb/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:55:56 crc kubenswrapper[4775]: I1216 15:55:56.790243 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-r5l7p_586695ef-512d-4d00-b127-751849932aef/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:55:56 crc 
kubenswrapper[4775]: I1216 15:55:56.856818 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5_023c8812-4f2e-4b64-85c7-eabd4ed3d7f9/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:55:57 crc kubenswrapper[4775]: I1216 15:55:57.288324 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-tnb9k_e0ba352c-17c3-4c36-b409-83485c265668/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:55:57 crc kubenswrapper[4775]: I1216 15:55:57.295750 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-jf2hn_67d72872-cd76-413f-bcbe-e0c6da3a8f5a/ssh-known-hosts-edpm-deployment/0.log" Dec 16 15:55:57 crc kubenswrapper[4775]: I1216 15:55:57.498800 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-78c6fdf4b7-xxfgx_15f3da25-9cb6-406e-b022-935c6201ea4a/proxy-server/0.log" Dec 16 15:55:57 crc kubenswrapper[4775]: I1216 15:55:57.672627 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-tp2tw_edd66213-7818-408d-a6ec-73c6e3b39321/swift-ring-rebalance/0.log" Dec 16 15:55:57 crc kubenswrapper[4775]: I1216 15:55:57.692901 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-78c6fdf4b7-xxfgx_15f3da25-9cb6-406e-b022-935c6201ea4a/proxy-httpd/0.log" Dec 16 15:55:57 crc kubenswrapper[4775]: I1216 15:55:57.781223 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/account-auditor/0.log" Dec 16 15:55:57 crc kubenswrapper[4775]: I1216 15:55:57.920119 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/account-server/0.log" Dec 16 15:55:57 crc kubenswrapper[4775]: I1216 15:55:57.935070 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/account-reaper/0.log" Dec 16 15:55:57 crc kubenswrapper[4775]: I1216 15:55:57.966225 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/container-auditor/0.log" Dec 16 15:55:57 crc kubenswrapper[4775]: I1216 15:55:57.968384 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/account-replicator/0.log" Dec 16 15:55:58 crc kubenswrapper[4775]: I1216 15:55:58.152552 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/object-auditor/0.log" Dec 16 15:55:58 crc kubenswrapper[4775]: I1216 15:55:58.167515 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/container-server/0.log" Dec 16 15:55:58 crc kubenswrapper[4775]: I1216 15:55:58.175228 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/container-replicator/0.log" Dec 16 15:55:58 crc kubenswrapper[4775]: I1216 15:55:58.177701 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/container-updater/0.log" Dec 16 15:55:58 crc kubenswrapper[4775]: I1216 15:55:58.326663 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/object-expirer/0.log" Dec 16 15:55:58 crc kubenswrapper[4775]: I1216 15:55:58.376606 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/object-replicator/0.log" Dec 16 15:55:58 crc kubenswrapper[4775]: I1216 15:55:58.397071 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/object-updater/0.log" Dec 16 15:55:58 crc kubenswrapper[4775]: I1216 15:55:58.407564 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/object-server/0.log" Dec 16 15:55:58 crc kubenswrapper[4775]: I1216 15:55:58.546452 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/rsync/0.log" Dec 16 15:55:58 crc kubenswrapper[4775]: I1216 15:55:58.625381 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/swift-recon-cron/0.log" Dec 16 15:55:58 crc kubenswrapper[4775]: I1216 15:55:58.760838 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-9n929_533cc620-42ce-4262-bcfe-25c8ebe74ff6/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:55:58 crc kubenswrapper[4775]: I1216 15:55:58.841922 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_81e92dde-6675-4a19-a619-52358e91c49c/tempest-tests-tempest-tests-runner/0.log" Dec 16 15:55:58 crc kubenswrapper[4775]: I1216 15:55:58.934246 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_974b82d4-0fe0-449c-89d3-619ac869f974/test-operator-logs-container/0.log" Dec 16 15:55:59 crc kubenswrapper[4775]: I1216 15:55:59.075012 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4_55b74b45-197a-47f8-88cf-ce675418f3ca/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:56:06 crc kubenswrapper[4775]: I1216 15:56:06.703061 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_f8bdb272-4c39-4532-926a-f3dcc70af374/memcached/0.log" Dec 16 15:56:22 crc kubenswrapper[4775]: I1216 15:56:22.922522 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-95949466-9m4t8_03a9286d-3fd3-4ec6-9a1d-fb8d613f401e/manager/0.log" Dec 16 15:56:23 crc kubenswrapper[4775]: I1216 15:56:23.121712 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt_16deb1c1-d3c3-46d3-b565-30ef1773f202/util/0.log" Dec 16 15:56:23 crc kubenswrapper[4775]: I1216 15:56:23.292534 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt_16deb1c1-d3c3-46d3-b565-30ef1773f202/pull/0.log" Dec 16 15:56:23 crc kubenswrapper[4775]: I1216 15:56:23.326751 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt_16deb1c1-d3c3-46d3-b565-30ef1773f202/util/0.log" Dec 16 15:56:23 crc kubenswrapper[4775]: I1216 15:56:23.328607 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt_16deb1c1-d3c3-46d3-b565-30ef1773f202/pull/0.log" Dec 16 15:56:23 crc kubenswrapper[4775]: I1216 15:56:23.502390 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt_16deb1c1-d3c3-46d3-b565-30ef1773f202/extract/0.log" Dec 16 15:56:23 crc kubenswrapper[4775]: I1216 15:56:23.504638 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt_16deb1c1-d3c3-46d3-b565-30ef1773f202/pull/0.log" Dec 16 15:56:23 crc kubenswrapper[4775]: I1216 15:56:23.557434 4775 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt_16deb1c1-d3c3-46d3-b565-30ef1773f202/util/0.log" Dec 16 15:56:23 crc kubenswrapper[4775]: I1216 15:56:23.705121 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5f98b4754f-5gxtk_e002ee65-47de-44d4-864e-531283c322f7/manager/0.log" Dec 16 15:56:23 crc kubenswrapper[4775]: I1216 15:56:23.738794 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66f8b87655-dfdrn_d0dab2aa-577b-4a9d-bcce-0530cbb3e4b6/manager/0.log" Dec 16 15:56:23 crc kubenswrapper[4775]: I1216 15:56:23.951915 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-767f9d7567-dqpmk_8e002a19-f4ca-4186-940c-321834e88e5e/manager/0.log" Dec 16 15:56:24 crc kubenswrapper[4775]: I1216 15:56:24.024290 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5559d9665f-4hmbr_c5962fcc-3c3b-435a-b848-237af19ce258/manager/0.log" Dec 16 15:56:24 crc kubenswrapper[4775]: I1216 15:56:24.152271 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6ccf486b9-pk5fg_a738c781-0876-490f-bf95-d7d77a6f2aff/manager/0.log" Dec 16 15:56:24 crc kubenswrapper[4775]: I1216 15:56:24.372621 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-f458558d7-9b8tb_d8873d69-8f0e-4816-b39e-bf8506282196/manager/0.log" Dec 16 15:56:24 crc kubenswrapper[4775]: I1216 15:56:24.435657 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6558fdd56c-jc4nj_be824423-7753-4920-8aa7-93d2904280fb/manager/0.log" Dec 16 15:56:24 crc kubenswrapper[4775]: I1216 15:56:24.535838 4775 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5c7cbf548f-lph76_eff249fe-7aa9-406b-a4f0-91d7891afc8b/manager/0.log" Dec 16 15:56:24 crc kubenswrapper[4775]: I1216 15:56:24.649944 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5fdd9786f7-gcfkm_f82f14a2-7460-4a06-978b-d22d9ad7d6bd/manager/0.log" Dec 16 15:56:24 crc kubenswrapper[4775]: I1216 15:56:24.716854 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f76f4954c-47s9s_4fbf17e0-d42f-463b-9f01-a39d842812ff/manager/0.log" Dec 16 15:56:24 crc kubenswrapper[4775]: I1216 15:56:24.832676 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7cd87b778f-qmqgx_1723eb19-5ef2-43d0-a1f8-590e89eb5f87/manager/0.log" Dec 16 15:56:24 crc kubenswrapper[4775]: I1216 15:56:24.979496 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5fbbf8b6cc-nvb99_63c035e4-8ff2-49a4-94d9-57c65a71494b/manager/0.log" Dec 16 15:56:25 crc kubenswrapper[4775]: I1216 15:56:25.068165 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-68c649d9d-nj5th_11012716-6e3c-4b17-97c7-16e723ad1092/manager/0.log" Dec 16 15:56:25 crc kubenswrapper[4775]: I1216 15:56:25.168241 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb_275767d8-4eed-4a90-8d43-348c607ee37e/manager/0.log" Dec 16 15:56:25 crc kubenswrapper[4775]: I1216 15:56:25.613965 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-nxcc2_16d64d82-cfc5-461a-a39a-48fd77562a54/registry-server/0.log" Dec 16 15:56:25 crc kubenswrapper[4775]: I1216 
15:56:25.672756 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6d499bd55-lnqxb_480fe07d-8bd7-4879-bb80-ceb5f0baf2cb/operator/0.log" Dec 16 15:56:25 crc kubenswrapper[4775]: I1216 15:56:25.907565 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bf6d4f946-d9rg9_b70a54b3-3bc0-45e4-add9-d47b81371266/manager/0.log" Dec 16 15:56:26 crc kubenswrapper[4775]: I1216 15:56:26.081408 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8665b56d78-7fmnw_19d1c138-c230-44b2-972c-c557693054f5/manager/0.log" Dec 16 15:56:26 crc kubenswrapper[4775]: I1216 15:56:26.285695 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xd48q_d132ccba-b1e9-4f8c-8129-1087a1a672b9/operator/0.log" Dec 16 15:56:26 crc kubenswrapper[4775]: I1216 15:56:26.467144 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5c6df8f9-r8p6v_85cc53cf-83a7-4810-b0fc-7317f9327c09/manager/0.log" Dec 16 15:56:26 crc kubenswrapper[4775]: I1216 15:56:26.543679 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7bc9b98d8-rvdbc_bd6aff58-984e-4106-acb0-c689f6e31832/manager/0.log" Dec 16 15:56:26 crc kubenswrapper[4775]: I1216 15:56:26.658596 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-97d456b9-d2kbz_2f9a8b75-2e17-43ce-be88-dbc6f7ec0cb1/manager/0.log" Dec 16 15:56:26 crc kubenswrapper[4775]: I1216 15:56:26.736181 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-756ccf86c7-4mncx_f05c78d5-d86c-42de-9eee-e8d09204a0b4/manager/0.log" Dec 16 15:56:26 crc 
kubenswrapper[4775]: I1216 15:56:26.851671 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-55f78b7c4c-bl86c_14102b10-a3ba-4f16-9928-4f41426a435f/manager/0.log" Dec 16 15:56:44 crc kubenswrapper[4775]: I1216 15:56:44.358469 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2tkrk_a492f5f7-b613-4e56-8071-78f8c836e7c3/control-plane-machine-set-operator/0.log" Dec 16 15:56:44 crc kubenswrapper[4775]: I1216 15:56:44.510773 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5fjmz_e1a2834e-159c-47f0-81a8-87d37d89a22a/kube-rbac-proxy/0.log" Dec 16 15:56:44 crc kubenswrapper[4775]: I1216 15:56:44.522224 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5fjmz_e1a2834e-159c-47f0-81a8-87d37d89a22a/machine-api-operator/0.log" Dec 16 15:56:56 crc kubenswrapper[4775]: I1216 15:56:56.419835 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-kh9z9_4fc14e4b-fa58-41f3-b5b4-f27d75e6a294/cert-manager-controller/0.log" Dec 16 15:56:56 crc kubenswrapper[4775]: I1216 15:56:56.513210 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-s2cqf_5aa53da3-90be-4e8d-874f-817fce504026/cert-manager-cainjector/0.log" Dec 16 15:56:56 crc kubenswrapper[4775]: I1216 15:56:56.607059 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-gxr9j_370c0803-3050-431b-82e2-d3d69f5d386f/cert-manager-webhook/0.log" Dec 16 15:57:08 crc kubenswrapper[4775]: I1216 15:57:08.194954 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-6842d_a471fecb-d3ef-427f-a02c-30a00b513bae/nmstate-console-plugin/0.log" Dec 16 
15:57:08 crc kubenswrapper[4775]: I1216 15:57:08.420218 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-7hgfg_6384fd2d-45e1-421e-920f-5555dc0f8a10/kube-rbac-proxy/0.log" Dec 16 15:57:08 crc kubenswrapper[4775]: I1216 15:57:08.432863 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-sc4rw_5c66735d-0eb0-46a8-b2db-f65158873132/nmstate-handler/0.log" Dec 16 15:57:08 crc kubenswrapper[4775]: I1216 15:57:08.574742 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-kkdbg_0318b125-3608-48d1-b19f-8fcad1785fa8/nmstate-operator/0.log" Dec 16 15:57:08 crc kubenswrapper[4775]: I1216 15:57:08.597220 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-7hgfg_6384fd2d-45e1-421e-920f-5555dc0f8a10/nmstate-metrics/0.log" Dec 16 15:57:08 crc kubenswrapper[4775]: I1216 15:57:08.765249 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-4p9vw_52cbae70-fde7-47d8-a118-799f6fb64f2b/nmstate-webhook/0.log" Dec 16 15:57:23 crc kubenswrapper[4775]: I1216 15:57:23.408623 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-khxxp_aa78cbdc-f63e-4010-9bdc-88715f997591/kube-rbac-proxy/0.log" Dec 16 15:57:23 crc kubenswrapper[4775]: I1216 15:57:23.437506 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-khxxp_aa78cbdc-f63e-4010-9bdc-88715f997591/controller/0.log" Dec 16 15:57:23 crc kubenswrapper[4775]: I1216 15:57:23.624956 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-frr-files/0.log" Dec 16 15:57:23 crc kubenswrapper[4775]: I1216 15:57:23.828977 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-frr-files/0.log" Dec 16 15:57:23 crc kubenswrapper[4775]: I1216 15:57:23.845571 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-metrics/0.log" Dec 16 15:57:23 crc kubenswrapper[4775]: I1216 15:57:23.845590 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-reloader/0.log" Dec 16 15:57:23 crc kubenswrapper[4775]: I1216 15:57:23.894713 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-reloader/0.log" Dec 16 15:57:24 crc kubenswrapper[4775]: I1216 15:57:24.063542 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-frr-files/0.log" Dec 16 15:57:24 crc kubenswrapper[4775]: I1216 15:57:24.066217 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-reloader/0.log" Dec 16 15:57:24 crc kubenswrapper[4775]: I1216 15:57:24.115333 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-metrics/0.log" Dec 16 15:57:24 crc kubenswrapper[4775]: I1216 15:57:24.137133 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-metrics/0.log" Dec 16 15:57:24 crc kubenswrapper[4775]: I1216 15:57:24.335400 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-metrics/0.log" Dec 16 15:57:24 crc kubenswrapper[4775]: I1216 15:57:24.338674 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/controller/0.log" Dec 16 15:57:24 crc kubenswrapper[4775]: I1216 15:57:24.344781 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-frr-files/0.log" Dec 16 15:57:24 crc kubenswrapper[4775]: I1216 15:57:24.385316 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-reloader/0.log" Dec 16 15:57:24 crc kubenswrapper[4775]: I1216 15:57:24.555827 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/frr-metrics/0.log" Dec 16 15:57:24 crc kubenswrapper[4775]: I1216 15:57:24.562994 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/kube-rbac-proxy/0.log" Dec 16 15:57:24 crc kubenswrapper[4775]: I1216 15:57:24.606353 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/kube-rbac-proxy-frr/0.log" Dec 16 15:57:24 crc kubenswrapper[4775]: I1216 15:57:24.802129 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/reloader/0.log" Dec 16 15:57:24 crc kubenswrapper[4775]: I1216 15:57:24.831858 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-6hfpf_b55d23ff-e3e6-460c-8058-0489204c8a4d/frr-k8s-webhook-server/0.log" Dec 16 15:57:25 crc kubenswrapper[4775]: I1216 15:57:25.026296 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6dbcb5f69b-g6llk_4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd/manager/0.log" Dec 16 15:57:25 crc kubenswrapper[4775]: I1216 15:57:25.221945 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7889b8b87-lgnbs_e79447d0-f855-4f85-a021-0618e819f822/webhook-server/0.log" Dec 16 15:57:25 crc kubenswrapper[4775]: I1216 15:57:25.276962 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hzvbb_2516e125-5678-4a01-8a6b-1f8865b69f77/kube-rbac-proxy/0.log" Dec 16 15:57:25 crc kubenswrapper[4775]: I1216 15:57:25.869309 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/frr/0.log" Dec 16 15:57:25 crc kubenswrapper[4775]: I1216 15:57:25.889825 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hzvbb_2516e125-5678-4a01-8a6b-1f8865b69f77/speaker/0.log" Dec 16 15:57:32 crc kubenswrapper[4775]: I1216 15:57:32.869252 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:57:32 crc kubenswrapper[4775]: I1216 15:57:32.869790 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:57:38 crc kubenswrapper[4775]: I1216 15:57:38.209110 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66_50c7dffe-e977-448f-bcdd-7a68df1cefca/util/0.log" Dec 16 15:57:38 crc kubenswrapper[4775]: I1216 15:57:38.361704 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66_50c7dffe-e977-448f-bcdd-7a68df1cefca/util/0.log" Dec 16 15:57:38 crc kubenswrapper[4775]: I1216 15:57:38.363923 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66_50c7dffe-e977-448f-bcdd-7a68df1cefca/pull/0.log" Dec 16 15:57:38 crc kubenswrapper[4775]: I1216 15:57:38.423006 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66_50c7dffe-e977-448f-bcdd-7a68df1cefca/pull/0.log" Dec 16 15:57:38 crc kubenswrapper[4775]: I1216 15:57:38.572895 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66_50c7dffe-e977-448f-bcdd-7a68df1cefca/pull/0.log" Dec 16 15:57:38 crc kubenswrapper[4775]: I1216 15:57:38.604927 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66_50c7dffe-e977-448f-bcdd-7a68df1cefca/extract/0.log" Dec 16 15:57:38 crc kubenswrapper[4775]: I1216 15:57:38.611593 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66_50c7dffe-e977-448f-bcdd-7a68df1cefca/util/0.log" Dec 16 15:57:38 crc kubenswrapper[4775]: I1216 15:57:38.783226 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx_739f7090-9a46-4ae3-a85b-045a2b1e197d/util/0.log" Dec 16 15:57:38 crc kubenswrapper[4775]: I1216 15:57:38.898302 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx_739f7090-9a46-4ae3-a85b-045a2b1e197d/util/0.log" Dec 16 
15:57:38 crc kubenswrapper[4775]: I1216 15:57:38.926315 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx_739f7090-9a46-4ae3-a85b-045a2b1e197d/pull/0.log" Dec 16 15:57:38 crc kubenswrapper[4775]: I1216 15:57:38.941405 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx_739f7090-9a46-4ae3-a85b-045a2b1e197d/pull/0.log" Dec 16 15:57:39 crc kubenswrapper[4775]: I1216 15:57:39.079342 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx_739f7090-9a46-4ae3-a85b-045a2b1e197d/util/0.log" Dec 16 15:57:39 crc kubenswrapper[4775]: I1216 15:57:39.106330 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx_739f7090-9a46-4ae3-a85b-045a2b1e197d/pull/0.log" Dec 16 15:57:39 crc kubenswrapper[4775]: I1216 15:57:39.118779 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx_739f7090-9a46-4ae3-a85b-045a2b1e197d/extract/0.log" Dec 16 15:57:39 crc kubenswrapper[4775]: I1216 15:57:39.262042 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dzt2q_2be58473-7d1b-4c58-a3a7-862cd4846f63/extract-utilities/0.log" Dec 16 15:57:39 crc kubenswrapper[4775]: I1216 15:57:39.475124 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dzt2q_2be58473-7d1b-4c58-a3a7-862cd4846f63/extract-content/0.log" Dec 16 15:57:39 crc kubenswrapper[4775]: I1216 15:57:39.484875 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-dzt2q_2be58473-7d1b-4c58-a3a7-862cd4846f63/extract-content/0.log" Dec 16 15:57:39 crc kubenswrapper[4775]: I1216 15:57:39.499487 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dzt2q_2be58473-7d1b-4c58-a3a7-862cd4846f63/extract-utilities/0.log" Dec 16 15:57:39 crc kubenswrapper[4775]: I1216 15:57:39.634619 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dzt2q_2be58473-7d1b-4c58-a3a7-862cd4846f63/extract-content/0.log" Dec 16 15:57:39 crc kubenswrapper[4775]: I1216 15:57:39.664096 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dzt2q_2be58473-7d1b-4c58-a3a7-862cd4846f63/extract-utilities/0.log" Dec 16 15:57:39 crc kubenswrapper[4775]: I1216 15:57:39.879189 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwsw2_37c67918-469b-4d46-aabb-63b96e941479/extract-utilities/0.log" Dec 16 15:57:40 crc kubenswrapper[4775]: I1216 15:57:40.090297 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwsw2_37c67918-469b-4d46-aabb-63b96e941479/extract-content/0.log" Dec 16 15:57:40 crc kubenswrapper[4775]: I1216 15:57:40.133554 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwsw2_37c67918-469b-4d46-aabb-63b96e941479/extract-utilities/0.log" Dec 16 15:57:40 crc kubenswrapper[4775]: I1216 15:57:40.146823 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwsw2_37c67918-469b-4d46-aabb-63b96e941479/extract-content/0.log" Dec 16 15:57:40 crc kubenswrapper[4775]: I1216 15:57:40.236782 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-dzt2q_2be58473-7d1b-4c58-a3a7-862cd4846f63/registry-server/0.log" Dec 16 15:57:40 crc kubenswrapper[4775]: I1216 15:57:40.322803 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwsw2_37c67918-469b-4d46-aabb-63b96e941479/extract-utilities/0.log" Dec 16 15:57:40 crc kubenswrapper[4775]: I1216 15:57:40.350668 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwsw2_37c67918-469b-4d46-aabb-63b96e941479/extract-content/0.log" Dec 16 15:57:40 crc kubenswrapper[4775]: I1216 15:57:40.625910 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-76f84_7d608ef1-7f5b-45c5-80ce-f9be86cd93fe/marketplace-operator/0.log" Dec 16 15:57:40 crc kubenswrapper[4775]: I1216 15:57:40.698298 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-67zrm_59df28e1-27a5-451d-9784-a30eba2a3dc0/extract-utilities/0.log" Dec 16 15:57:40 crc kubenswrapper[4775]: I1216 15:57:40.938415 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-67zrm_59df28e1-27a5-451d-9784-a30eba2a3dc0/extract-content/0.log" Dec 16 15:57:41 crc kubenswrapper[4775]: I1216 15:57:41.036129 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-67zrm_59df28e1-27a5-451d-9784-a30eba2a3dc0/extract-utilities/0.log" Dec 16 15:57:41 crc kubenswrapper[4775]: I1216 15:57:41.039198 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-67zrm_59df28e1-27a5-451d-9784-a30eba2a3dc0/extract-content/0.log" Dec 16 15:57:41 crc kubenswrapper[4775]: I1216 15:57:41.053481 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-fwsw2_37c67918-469b-4d46-aabb-63b96e941479/registry-server/0.log" Dec 16 15:57:41 crc kubenswrapper[4775]: I1216 15:57:41.207076 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-67zrm_59df28e1-27a5-451d-9784-a30eba2a3dc0/extract-utilities/0.log" Dec 16 15:57:41 crc kubenswrapper[4775]: I1216 15:57:41.269808 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-67zrm_59df28e1-27a5-451d-9784-a30eba2a3dc0/extract-content/0.log" Dec 16 15:57:41 crc kubenswrapper[4775]: I1216 15:57:41.322545 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-67zrm_59df28e1-27a5-451d-9784-a30eba2a3dc0/registry-server/0.log" Dec 16 15:57:41 crc kubenswrapper[4775]: I1216 15:57:41.402644 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s28hs_e182acf8-e0f8-4ad4-b91f-0028568a79c3/extract-utilities/0.log" Dec 16 15:57:41 crc kubenswrapper[4775]: I1216 15:57:41.593584 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s28hs_e182acf8-e0f8-4ad4-b91f-0028568a79c3/extract-content/0.log" Dec 16 15:57:41 crc kubenswrapper[4775]: I1216 15:57:41.634617 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s28hs_e182acf8-e0f8-4ad4-b91f-0028568a79c3/extract-utilities/0.log" Dec 16 15:57:41 crc kubenswrapper[4775]: I1216 15:57:41.641957 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s28hs_e182acf8-e0f8-4ad4-b91f-0028568a79c3/extract-content/0.log" Dec 16 15:57:41 crc kubenswrapper[4775]: I1216 15:57:41.754793 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s28hs_e182acf8-e0f8-4ad4-b91f-0028568a79c3/extract-utilities/0.log" 
Dec 16 15:57:41 crc kubenswrapper[4775]: I1216 15:57:41.799634 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s28hs_e182acf8-e0f8-4ad4-b91f-0028568a79c3/extract-content/0.log" Dec 16 15:57:42 crc kubenswrapper[4775]: I1216 15:57:42.195152 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s28hs_e182acf8-e0f8-4ad4-b91f-0028568a79c3/registry-server/0.log" Dec 16 15:58:02 crc kubenswrapper[4775]: I1216 15:58:02.868621 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:58:02 crc kubenswrapper[4775]: I1216 15:58:02.869186 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:58:32 crc kubenswrapper[4775]: I1216 15:58:32.869374 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:58:32 crc kubenswrapper[4775]: I1216 15:58:32.870041 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:58:32 crc kubenswrapper[4775]: 
I1216 15:58:32.870104 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 15:58:32 crc kubenswrapper[4775]: I1216 15:58:32.871048 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf"} pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:58:32 crc kubenswrapper[4775]: I1216 15:58:32.871109 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" containerID="cri-o://a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" gracePeriod=600 Dec 16 15:58:33 crc kubenswrapper[4775]: E1216 15:58:33.147241 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:58:33 crc kubenswrapper[4775]: I1216 15:58:33.990163 4775 generic.go:334] "Generic (PLEG): container finished" podID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" exitCode=0 Dec 16 15:58:33 crc kubenswrapper[4775]: I1216 15:58:33.990667 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" 
event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerDied","Data":"a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf"} Dec 16 15:58:33 crc kubenswrapper[4775]: I1216 15:58:33.990713 4775 scope.go:117] "RemoveContainer" containerID="e1c8d61f3889d5bd1528924be8d5a0555fad8c946aa20ed5020d09539894e766" Dec 16 15:58:33 crc kubenswrapper[4775]: I1216 15:58:33.991477 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 15:58:33 crc kubenswrapper[4775]: E1216 15:58:33.991919 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:58:47 crc kubenswrapper[4775]: I1216 15:58:47.339489 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 15:58:47 crc kubenswrapper[4775]: E1216 15:58:47.340484 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:58:58 crc kubenswrapper[4775]: I1216 15:58:58.338432 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 15:58:58 crc kubenswrapper[4775]: E1216 15:58:58.339200 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:59:11 crc kubenswrapper[4775]: I1216 15:59:11.338267 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 15:59:11 crc kubenswrapper[4775]: E1216 15:59:11.339469 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:59:22 crc kubenswrapper[4775]: I1216 15:59:22.337567 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 15:59:22 crc kubenswrapper[4775]: E1216 15:59:22.338294 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:59:29 crc kubenswrapper[4775]: I1216 15:59:29.503140 4775 generic.go:334] "Generic (PLEG): container finished" podID="bf793d59-ff27-4cfb-b547-a15d08fc0367" containerID="8d49b382db5514949da76b0c0427c57250e74ef8bcc162314cbae4d103125c55" exitCode=0 Dec 16 15:59:29 crc kubenswrapper[4775]: 
I1216 15:59:29.503264 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqrn5/must-gather-q7z9n" event={"ID":"bf793d59-ff27-4cfb-b547-a15d08fc0367","Type":"ContainerDied","Data":"8d49b382db5514949da76b0c0427c57250e74ef8bcc162314cbae4d103125c55"} Dec 16 15:59:29 crc kubenswrapper[4775]: I1216 15:59:29.504229 4775 scope.go:117] "RemoveContainer" containerID="8d49b382db5514949da76b0c0427c57250e74ef8bcc162314cbae4d103125c55" Dec 16 15:59:30 crc kubenswrapper[4775]: I1216 15:59:30.454024 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kqrn5_must-gather-q7z9n_bf793d59-ff27-4cfb-b547-a15d08fc0367/gather/0.log" Dec 16 15:59:37 crc kubenswrapper[4775]: I1216 15:59:37.338824 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 15:59:37 crc kubenswrapper[4775]: E1216 15:59:37.340039 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 15:59:39 crc kubenswrapper[4775]: I1216 15:59:39.750138 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kqrn5/must-gather-q7z9n"] Dec 16 15:59:39 crc kubenswrapper[4775]: I1216 15:59:39.751197 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kqrn5/must-gather-q7z9n" podUID="bf793d59-ff27-4cfb-b547-a15d08fc0367" containerName="copy" containerID="cri-o://4d5106ae799fba21c2d4dbc5de0a988fb75c578f4acac150921ad480e88537f0" gracePeriod=2 Dec 16 15:59:39 crc kubenswrapper[4775]: I1216 15:59:39.761572 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-kqrn5/must-gather-q7z9n"] Dec 16 15:59:40 crc kubenswrapper[4775]: I1216 15:59:40.298868 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kqrn5_must-gather-q7z9n_bf793d59-ff27-4cfb-b547-a15d08fc0367/copy/0.log" Dec 16 15:59:40 crc kubenswrapper[4775]: I1216 15:59:40.299392 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqrn5/must-gather-q7z9n" Dec 16 15:59:40 crc kubenswrapper[4775]: I1216 15:59:40.463422 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbsq8\" (UniqueName: \"kubernetes.io/projected/bf793d59-ff27-4cfb-b547-a15d08fc0367-kube-api-access-rbsq8\") pod \"bf793d59-ff27-4cfb-b547-a15d08fc0367\" (UID: \"bf793d59-ff27-4cfb-b547-a15d08fc0367\") " Dec 16 15:59:40 crc kubenswrapper[4775]: I1216 15:59:40.463617 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bf793d59-ff27-4cfb-b547-a15d08fc0367-must-gather-output\") pod \"bf793d59-ff27-4cfb-b547-a15d08fc0367\" (UID: \"bf793d59-ff27-4cfb-b547-a15d08fc0367\") " Dec 16 15:59:40 crc kubenswrapper[4775]: I1216 15:59:40.470241 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf793d59-ff27-4cfb-b547-a15d08fc0367-kube-api-access-rbsq8" (OuterVolumeSpecName: "kube-api-access-rbsq8") pod "bf793d59-ff27-4cfb-b547-a15d08fc0367" (UID: "bf793d59-ff27-4cfb-b547-a15d08fc0367"). InnerVolumeSpecName "kube-api-access-rbsq8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:59:40 crc kubenswrapper[4775]: I1216 15:59:40.566009 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbsq8\" (UniqueName: \"kubernetes.io/projected/bf793d59-ff27-4cfb-b547-a15d08fc0367-kube-api-access-rbsq8\") on node \"crc\" DevicePath \"\"" Dec 16 15:59:40 crc kubenswrapper[4775]: I1216 15:59:40.613952 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kqrn5_must-gather-q7z9n_bf793d59-ff27-4cfb-b547-a15d08fc0367/copy/0.log" Dec 16 15:59:40 crc kubenswrapper[4775]: I1216 15:59:40.614558 4775 generic.go:334] "Generic (PLEG): container finished" podID="bf793d59-ff27-4cfb-b547-a15d08fc0367" containerID="4d5106ae799fba21c2d4dbc5de0a988fb75c578f4acac150921ad480e88537f0" exitCode=143 Dec 16 15:59:40 crc kubenswrapper[4775]: I1216 15:59:40.614662 4775 scope.go:117] "RemoveContainer" containerID="4d5106ae799fba21c2d4dbc5de0a988fb75c578f4acac150921ad480e88537f0" Dec 16 15:59:40 crc kubenswrapper[4775]: I1216 15:59:40.614940 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqrn5/must-gather-q7z9n" Dec 16 15:59:40 crc kubenswrapper[4775]: I1216 15:59:40.628522 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf793d59-ff27-4cfb-b547-a15d08fc0367-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "bf793d59-ff27-4cfb-b547-a15d08fc0367" (UID: "bf793d59-ff27-4cfb-b547-a15d08fc0367"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:59:40 crc kubenswrapper[4775]: I1216 15:59:40.636412 4775 scope.go:117] "RemoveContainer" containerID="8d49b382db5514949da76b0c0427c57250e74ef8bcc162314cbae4d103125c55" Dec 16 15:59:40 crc kubenswrapper[4775]: I1216 15:59:40.668908 4775 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bf793d59-ff27-4cfb-b547-a15d08fc0367-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 16 15:59:40 crc kubenswrapper[4775]: I1216 15:59:40.709932 4775 scope.go:117] "RemoveContainer" containerID="4d5106ae799fba21c2d4dbc5de0a988fb75c578f4acac150921ad480e88537f0" Dec 16 15:59:40 crc kubenswrapper[4775]: E1216 15:59:40.710519 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d5106ae799fba21c2d4dbc5de0a988fb75c578f4acac150921ad480e88537f0\": container with ID starting with 4d5106ae799fba21c2d4dbc5de0a988fb75c578f4acac150921ad480e88537f0 not found: ID does not exist" containerID="4d5106ae799fba21c2d4dbc5de0a988fb75c578f4acac150921ad480e88537f0" Dec 16 15:59:40 crc kubenswrapper[4775]: I1216 15:59:40.710576 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d5106ae799fba21c2d4dbc5de0a988fb75c578f4acac150921ad480e88537f0"} err="failed to get container status \"4d5106ae799fba21c2d4dbc5de0a988fb75c578f4acac150921ad480e88537f0\": rpc error: code = NotFound desc = could not find container \"4d5106ae799fba21c2d4dbc5de0a988fb75c578f4acac150921ad480e88537f0\": container with ID starting with 4d5106ae799fba21c2d4dbc5de0a988fb75c578f4acac150921ad480e88537f0 not found: ID does not exist" Dec 16 15:59:40 crc kubenswrapper[4775]: I1216 15:59:40.710603 4775 scope.go:117] "RemoveContainer" containerID="8d49b382db5514949da76b0c0427c57250e74ef8bcc162314cbae4d103125c55" Dec 16 15:59:40 crc kubenswrapper[4775]: E1216 15:59:40.712128 4775 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d49b382db5514949da76b0c0427c57250e74ef8bcc162314cbae4d103125c55\": container with ID starting with 8d49b382db5514949da76b0c0427c57250e74ef8bcc162314cbae4d103125c55 not found: ID does not exist" containerID="8d49b382db5514949da76b0c0427c57250e74ef8bcc162314cbae4d103125c55" Dec 16 15:59:40 crc kubenswrapper[4775]: I1216 15:59:40.712165 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d49b382db5514949da76b0c0427c57250e74ef8bcc162314cbae4d103125c55"} err="failed to get container status \"8d49b382db5514949da76b0c0427c57250e74ef8bcc162314cbae4d103125c55\": rpc error: code = NotFound desc = could not find container \"8d49b382db5514949da76b0c0427c57250e74ef8bcc162314cbae4d103125c55\": container with ID starting with 8d49b382db5514949da76b0c0427c57250e74ef8bcc162314cbae4d103125c55 not found: ID does not exist" Dec 16 15:59:41 crc kubenswrapper[4775]: I1216 15:59:41.349435 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf793d59-ff27-4cfb-b547-a15d08fc0367" path="/var/lib/kubelet/pods/bf793d59-ff27-4cfb-b547-a15d08fc0367/volumes" Dec 16 15:59:51 crc kubenswrapper[4775]: I1216 15:59:51.338298 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 15:59:51 crc kubenswrapper[4775]: E1216 15:59:51.339211 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.195557 4775 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn"] Dec 16 16:00:00 crc kubenswrapper[4775]: E1216 16:00:00.196678 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2" containerName="container-00" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.196707 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2" containerName="container-00" Dec 16 16:00:00 crc kubenswrapper[4775]: E1216 16:00:00.196730 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf793d59-ff27-4cfb-b547-a15d08fc0367" containerName="gather" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.196738 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf793d59-ff27-4cfb-b547-a15d08fc0367" containerName="gather" Dec 16 16:00:00 crc kubenswrapper[4775]: E1216 16:00:00.196780 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf793d59-ff27-4cfb-b547-a15d08fc0367" containerName="copy" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.196788 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf793d59-ff27-4cfb-b547-a15d08fc0367" containerName="copy" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.197051 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf793d59-ff27-4cfb-b547-a15d08fc0367" containerName="copy" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.197106 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf793d59-ff27-4cfb-b547-a15d08fc0367" containerName="gather" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.197125 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c29df4d2-4bcd-4d8f-9ed1-b732ac615cd2" containerName="container-00" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.197898 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.200158 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.200870 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.205358 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn"] Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.361940 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8988c194-9903-4d8e-b106-766c9a8a9d58-config-volume\") pod \"collect-profiles-29431680-c2xdn\" (UID: \"8988c194-9903-4d8e-b106-766c9a8a9d58\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.362002 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8988c194-9903-4d8e-b106-766c9a8a9d58-secret-volume\") pod \"collect-profiles-29431680-c2xdn\" (UID: \"8988c194-9903-4d8e-b106-766c9a8a9d58\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.362137 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qncmj\" (UniqueName: \"kubernetes.io/projected/8988c194-9903-4d8e-b106-766c9a8a9d58-kube-api-access-qncmj\") pod \"collect-profiles-29431680-c2xdn\" (UID: \"8988c194-9903-4d8e-b106-766c9a8a9d58\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.463502 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qncmj\" (UniqueName: \"kubernetes.io/projected/8988c194-9903-4d8e-b106-766c9a8a9d58-kube-api-access-qncmj\") pod \"collect-profiles-29431680-c2xdn\" (UID: \"8988c194-9903-4d8e-b106-766c9a8a9d58\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.463734 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8988c194-9903-4d8e-b106-766c9a8a9d58-config-volume\") pod \"collect-profiles-29431680-c2xdn\" (UID: \"8988c194-9903-4d8e-b106-766c9a8a9d58\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.463787 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8988c194-9903-4d8e-b106-766c9a8a9d58-secret-volume\") pod \"collect-profiles-29431680-c2xdn\" (UID: \"8988c194-9903-4d8e-b106-766c9a8a9d58\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.465352 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8988c194-9903-4d8e-b106-766c9a8a9d58-config-volume\") pod \"collect-profiles-29431680-c2xdn\" (UID: \"8988c194-9903-4d8e-b106-766c9a8a9d58\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.469722 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/8988c194-9903-4d8e-b106-766c9a8a9d58-secret-volume\") pod \"collect-profiles-29431680-c2xdn\" (UID: \"8988c194-9903-4d8e-b106-766c9a8a9d58\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.491391 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qncmj\" (UniqueName: \"kubernetes.io/projected/8988c194-9903-4d8e-b106-766c9a8a9d58-kube-api-access-qncmj\") pod \"collect-profiles-29431680-c2xdn\" (UID: \"8988c194-9903-4d8e-b106-766c9a8a9d58\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.529001 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn" Dec 16 16:00:00 crc kubenswrapper[4775]: I1216 16:00:00.954913 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn"] Dec 16 16:00:00 crc kubenswrapper[4775]: W1216 16:00:00.973752 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8988c194_9903_4d8e_b106_766c9a8a9d58.slice/crio-315facbb5ab5e7e9e94f15c544df1f0349e1ea061f2d73fd84dd4254a58f0a0b WatchSource:0}: Error finding container 315facbb5ab5e7e9e94f15c544df1f0349e1ea061f2d73fd84dd4254a58f0a0b: Status 404 returned error can't find the container with id 315facbb5ab5e7e9e94f15c544df1f0349e1ea061f2d73fd84dd4254a58f0a0b Dec 16 16:00:01 crc kubenswrapper[4775]: I1216 16:00:01.804632 4775 generic.go:334] "Generic (PLEG): container finished" podID="8988c194-9903-4d8e-b106-766c9a8a9d58" containerID="6409df9805cf20f72ef64e0dcc72164bc3ee08ffa327523534d0175bc479c1a0" exitCode=0 Dec 16 16:00:01 crc kubenswrapper[4775]: I1216 16:00:01.805118 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn" event={"ID":"8988c194-9903-4d8e-b106-766c9a8a9d58","Type":"ContainerDied","Data":"6409df9805cf20f72ef64e0dcc72164bc3ee08ffa327523534d0175bc479c1a0"} Dec 16 16:00:01 crc kubenswrapper[4775]: I1216 16:00:01.805149 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn" event={"ID":"8988c194-9903-4d8e-b106-766c9a8a9d58","Type":"ContainerStarted","Data":"315facbb5ab5e7e9e94f15c544df1f0349e1ea061f2d73fd84dd4254a58f0a0b"} Dec 16 16:00:03 crc kubenswrapper[4775]: I1216 16:00:03.259936 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn" Dec 16 16:00:03 crc kubenswrapper[4775]: I1216 16:00:03.338258 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:00:03 crc kubenswrapper[4775]: E1216 16:00:03.338603 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:00:03 crc kubenswrapper[4775]: I1216 16:00:03.419114 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qncmj\" (UniqueName: \"kubernetes.io/projected/8988c194-9903-4d8e-b106-766c9a8a9d58-kube-api-access-qncmj\") pod \"8988c194-9903-4d8e-b106-766c9a8a9d58\" (UID: \"8988c194-9903-4d8e-b106-766c9a8a9d58\") " Dec 16 16:00:03 crc kubenswrapper[4775]: I1216 16:00:03.419352 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/8988c194-9903-4d8e-b106-766c9a8a9d58-config-volume\") pod \"8988c194-9903-4d8e-b106-766c9a8a9d58\" (UID: \"8988c194-9903-4d8e-b106-766c9a8a9d58\") " Dec 16 16:00:03 crc kubenswrapper[4775]: I1216 16:00:03.419492 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8988c194-9903-4d8e-b106-766c9a8a9d58-secret-volume\") pod \"8988c194-9903-4d8e-b106-766c9a8a9d58\" (UID: \"8988c194-9903-4d8e-b106-766c9a8a9d58\") " Dec 16 16:00:03 crc kubenswrapper[4775]: I1216 16:00:03.419948 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8988c194-9903-4d8e-b106-766c9a8a9d58-config-volume" (OuterVolumeSpecName: "config-volume") pod "8988c194-9903-4d8e-b106-766c9a8a9d58" (UID: "8988c194-9903-4d8e-b106-766c9a8a9d58"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 16:00:03 crc kubenswrapper[4775]: I1216 16:00:03.424976 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8988c194-9903-4d8e-b106-766c9a8a9d58-kube-api-access-qncmj" (OuterVolumeSpecName: "kube-api-access-qncmj") pod "8988c194-9903-4d8e-b106-766c9a8a9d58" (UID: "8988c194-9903-4d8e-b106-766c9a8a9d58"). InnerVolumeSpecName "kube-api-access-qncmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:00:03 crc kubenswrapper[4775]: I1216 16:00:03.427385 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8988c194-9903-4d8e-b106-766c9a8a9d58-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8988c194-9903-4d8e-b106-766c9a8a9d58" (UID: "8988c194-9903-4d8e-b106-766c9a8a9d58"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 16:00:03 crc kubenswrapper[4775]: I1216 16:00:03.522306 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qncmj\" (UniqueName: \"kubernetes.io/projected/8988c194-9903-4d8e-b106-766c9a8a9d58-kube-api-access-qncmj\") on node \"crc\" DevicePath \"\"" Dec 16 16:00:03 crc kubenswrapper[4775]: I1216 16:00:03.522367 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8988c194-9903-4d8e-b106-766c9a8a9d58-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 16:00:03 crc kubenswrapper[4775]: I1216 16:00:03.522386 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8988c194-9903-4d8e-b106-766c9a8a9d58-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 16:00:03 crc kubenswrapper[4775]: I1216 16:00:03.825183 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn" Dec 16 16:00:03 crc kubenswrapper[4775]: I1216 16:00:03.834629 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-c2xdn" event={"ID":"8988c194-9903-4d8e-b106-766c9a8a9d58","Type":"ContainerDied","Data":"315facbb5ab5e7e9e94f15c544df1f0349e1ea061f2d73fd84dd4254a58f0a0b"} Dec 16 16:00:03 crc kubenswrapper[4775]: I1216 16:00:03.834665 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="315facbb5ab5e7e9e94f15c544df1f0349e1ea061f2d73fd84dd4254a58f0a0b" Dec 16 16:00:04 crc kubenswrapper[4775]: I1216 16:00:04.343085 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm"] Dec 16 16:00:04 crc kubenswrapper[4775]: I1216 16:00:04.356629 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29431635-s9dsm"] Dec 16 16:00:05 crc kubenswrapper[4775]: I1216 16:00:05.347057 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96" path="/var/lib/kubelet/pods/ea2d7de9-7ba8-4595-a97b-0eaec0b4ba96/volumes" Dec 16 16:00:16 crc kubenswrapper[4775]: I1216 16:00:16.337479 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:00:16 crc kubenswrapper[4775]: E1216 16:00:16.338317 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:00:27 crc kubenswrapper[4775]: I1216 16:00:27.338547 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:00:27 crc kubenswrapper[4775]: E1216 16:00:27.339460 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:00:38 crc kubenswrapper[4775]: I1216 16:00:38.347058 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:00:38 crc kubenswrapper[4775]: E1216 16:00:38.348042 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:00:49 crc kubenswrapper[4775]: I1216 16:00:49.555213 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gr52q"] Dec 16 16:00:49 crc kubenswrapper[4775]: E1216 16:00:49.556307 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8988c194-9903-4d8e-b106-766c9a8a9d58" containerName="collect-profiles" Dec 16 16:00:49 crc kubenswrapper[4775]: I1216 16:00:49.556327 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8988c194-9903-4d8e-b106-766c9a8a9d58" containerName="collect-profiles" Dec 16 16:00:49 crc kubenswrapper[4775]: I1216 16:00:49.556575 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8988c194-9903-4d8e-b106-766c9a8a9d58" containerName="collect-profiles" Dec 16 16:00:49 crc kubenswrapper[4775]: I1216 16:00:49.558212 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gr52q" Dec 16 16:00:49 crc kubenswrapper[4775]: I1216 16:00:49.564950 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gr52q"] Dec 16 16:00:49 crc kubenswrapper[4775]: I1216 16:00:49.571220 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/372d939b-c26e-454a-ab35-dfb3b351bf87-utilities\") pod \"redhat-operators-gr52q\" (UID: \"372d939b-c26e-454a-ab35-dfb3b351bf87\") " pod="openshift-marketplace/redhat-operators-gr52q" Dec 16 16:00:49 crc kubenswrapper[4775]: I1216 16:00:49.571328 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbbtk\" (UniqueName: \"kubernetes.io/projected/372d939b-c26e-454a-ab35-dfb3b351bf87-kube-api-access-wbbtk\") pod \"redhat-operators-gr52q\" (UID: \"372d939b-c26e-454a-ab35-dfb3b351bf87\") " pod="openshift-marketplace/redhat-operators-gr52q" Dec 16 16:00:49 crc kubenswrapper[4775]: I1216 16:00:49.571657 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/372d939b-c26e-454a-ab35-dfb3b351bf87-catalog-content\") pod \"redhat-operators-gr52q\" (UID: \"372d939b-c26e-454a-ab35-dfb3b351bf87\") " pod="openshift-marketplace/redhat-operators-gr52q" Dec 16 16:00:49 crc kubenswrapper[4775]: I1216 16:00:49.672705 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/372d939b-c26e-454a-ab35-dfb3b351bf87-catalog-content\") pod \"redhat-operators-gr52q\" (UID: \"372d939b-c26e-454a-ab35-dfb3b351bf87\") " pod="openshift-marketplace/redhat-operators-gr52q" Dec 16 16:00:49 crc kubenswrapper[4775]: I1216 16:00:49.672818 4775 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/372d939b-c26e-454a-ab35-dfb3b351bf87-utilities\") pod \"redhat-operators-gr52q\" (UID: \"372d939b-c26e-454a-ab35-dfb3b351bf87\") " pod="openshift-marketplace/redhat-operators-gr52q" Dec 16 16:00:49 crc kubenswrapper[4775]: I1216 16:00:49.672871 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbbtk\" (UniqueName: \"kubernetes.io/projected/372d939b-c26e-454a-ab35-dfb3b351bf87-kube-api-access-wbbtk\") pod \"redhat-operators-gr52q\" (UID: \"372d939b-c26e-454a-ab35-dfb3b351bf87\") " pod="openshift-marketplace/redhat-operators-gr52q" Dec 16 16:00:49 crc kubenswrapper[4775]: I1216 16:00:49.673232 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/372d939b-c26e-454a-ab35-dfb3b351bf87-catalog-content\") pod \"redhat-operators-gr52q\" (UID: \"372d939b-c26e-454a-ab35-dfb3b351bf87\") " pod="openshift-marketplace/redhat-operators-gr52q" Dec 16 16:00:49 crc kubenswrapper[4775]: I1216 16:00:49.673372 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/372d939b-c26e-454a-ab35-dfb3b351bf87-utilities\") pod \"redhat-operators-gr52q\" (UID: \"372d939b-c26e-454a-ab35-dfb3b351bf87\") " pod="openshift-marketplace/redhat-operators-gr52q" Dec 16 16:00:49 crc kubenswrapper[4775]: I1216 16:00:49.694144 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbbtk\" (UniqueName: \"kubernetes.io/projected/372d939b-c26e-454a-ab35-dfb3b351bf87-kube-api-access-wbbtk\") pod \"redhat-operators-gr52q\" (UID: \"372d939b-c26e-454a-ab35-dfb3b351bf87\") " pod="openshift-marketplace/redhat-operators-gr52q" Dec 16 16:00:49 crc kubenswrapper[4775]: I1216 16:00:49.879785 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gr52q" Dec 16 16:00:50 crc kubenswrapper[4775]: I1216 16:00:50.305628 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gr52q"] Dec 16 16:00:51 crc kubenswrapper[4775]: I1216 16:00:51.249700 4775 generic.go:334] "Generic (PLEG): container finished" podID="372d939b-c26e-454a-ab35-dfb3b351bf87" containerID="5f5b68174cc63aad0a59f283641f6490cdfbb98d6c0d8d01e10dc88b5aa03b9e" exitCode=0 Dec 16 16:00:51 crc kubenswrapper[4775]: I1216 16:00:51.249915 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr52q" event={"ID":"372d939b-c26e-454a-ab35-dfb3b351bf87","Type":"ContainerDied","Data":"5f5b68174cc63aad0a59f283641f6490cdfbb98d6c0d8d01e10dc88b5aa03b9e"} Dec 16 16:00:51 crc kubenswrapper[4775]: I1216 16:00:51.253539 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr52q" event={"ID":"372d939b-c26e-454a-ab35-dfb3b351bf87","Type":"ContainerStarted","Data":"f62ca9a0560dc7192e05969fe693ffcaace0ca676d60e1e17d13917f504ff25b"} Dec 16 16:00:51 crc kubenswrapper[4775]: I1216 16:00:51.252390 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 16:00:52 crc kubenswrapper[4775]: I1216 16:00:52.337946 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:00:52 crc kubenswrapper[4775]: E1216 16:00:52.338651 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:00:53 crc 
kubenswrapper[4775]: I1216 16:00:53.275349 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr52q" event={"ID":"372d939b-c26e-454a-ab35-dfb3b351bf87","Type":"ContainerStarted","Data":"fba3a2e8c917b24346b16a6e1befbb10815a0d52f35e61aa04c17d467e048dfc"} Dec 16 16:00:56 crc kubenswrapper[4775]: I1216 16:00:56.303264 4775 generic.go:334] "Generic (PLEG): container finished" podID="372d939b-c26e-454a-ab35-dfb3b351bf87" containerID="fba3a2e8c917b24346b16a6e1befbb10815a0d52f35e61aa04c17d467e048dfc" exitCode=0 Dec 16 16:00:56 crc kubenswrapper[4775]: I1216 16:00:56.303367 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr52q" event={"ID":"372d939b-c26e-454a-ab35-dfb3b351bf87","Type":"ContainerDied","Data":"fba3a2e8c917b24346b16a6e1befbb10815a0d52f35e61aa04c17d467e048dfc"} Dec 16 16:00:58 crc kubenswrapper[4775]: I1216 16:00:58.324191 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr52q" event={"ID":"372d939b-c26e-454a-ab35-dfb3b351bf87","Type":"ContainerStarted","Data":"aa2a2554131103ebca04bbd9babab9cfc4591901ba56b06cb1c0b4e27ee006be"} Dec 16 16:00:58 crc kubenswrapper[4775]: I1216 16:00:58.345081 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gr52q" podStartSLOduration=3.1094601 podStartE2EDuration="9.345059578s" podCreationTimestamp="2025-12-16 16:00:49 +0000 UTC" firstStartedPulling="2025-12-16 16:00:51.252089774 +0000 UTC m=+3976.203168707" lastFinishedPulling="2025-12-16 16:00:57.487689262 +0000 UTC m=+3982.438768185" observedRunningTime="2025-12-16 16:00:58.338557534 +0000 UTC m=+3983.289636477" watchObservedRunningTime="2025-12-16 16:00:58.345059578 +0000 UTC m=+3983.296138501" Dec 16 16:00:58 crc kubenswrapper[4775]: I1216 16:00:58.656840 4775 scope.go:117] "RemoveContainer" 
containerID="2692c3d7920252eec3388e2da208e081057391885af8bff644c39f8a88abac7a" Dec 16 16:00:58 crc kubenswrapper[4775]: I1216 16:00:58.691670 4775 scope.go:117] "RemoveContainer" containerID="95c74dbb0eb319a6f6b01a923daefdf384e25eab4dbd3b122d18d3d5fdf16d24" Dec 16 16:00:59 crc kubenswrapper[4775]: I1216 16:00:59.881060 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gr52q" Dec 16 16:00:59 crc kubenswrapper[4775]: I1216 16:00:59.882749 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gr52q" Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 16:01:00.173477 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29431681-lc6pz"] Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 16:01:00.175098 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29431681-lc6pz" Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 16:01:00.182023 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29431681-lc6pz"] Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 16:01:00.273859 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-config-data\") pod \"keystone-cron-29431681-lc6pz\" (UID: \"1e7d3155-afab-4f73-98e3-f2ea11f36050\") " pod="openstack/keystone-cron-29431681-lc6pz" Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 16:01:00.274117 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-fernet-keys\") pod \"keystone-cron-29431681-lc6pz\" (UID: \"1e7d3155-afab-4f73-98e3-f2ea11f36050\") " pod="openstack/keystone-cron-29431681-lc6pz" Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 
16:01:00.274267 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-combined-ca-bundle\") pod \"keystone-cron-29431681-lc6pz\" (UID: \"1e7d3155-afab-4f73-98e3-f2ea11f36050\") " pod="openstack/keystone-cron-29431681-lc6pz" Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 16:01:00.274451 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7vxm\" (UniqueName: \"kubernetes.io/projected/1e7d3155-afab-4f73-98e3-f2ea11f36050-kube-api-access-l7vxm\") pod \"keystone-cron-29431681-lc6pz\" (UID: \"1e7d3155-afab-4f73-98e3-f2ea11f36050\") " pod="openstack/keystone-cron-29431681-lc6pz" Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 16:01:00.376703 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-config-data\") pod \"keystone-cron-29431681-lc6pz\" (UID: \"1e7d3155-afab-4f73-98e3-f2ea11f36050\") " pod="openstack/keystone-cron-29431681-lc6pz" Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 16:01:00.376876 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-fernet-keys\") pod \"keystone-cron-29431681-lc6pz\" (UID: \"1e7d3155-afab-4f73-98e3-f2ea11f36050\") " pod="openstack/keystone-cron-29431681-lc6pz" Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 16:01:00.376985 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-combined-ca-bundle\") pod \"keystone-cron-29431681-lc6pz\" (UID: \"1e7d3155-afab-4f73-98e3-f2ea11f36050\") " pod="openstack/keystone-cron-29431681-lc6pz" Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 
16:01:00.377078 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7vxm\" (UniqueName: \"kubernetes.io/projected/1e7d3155-afab-4f73-98e3-f2ea11f36050-kube-api-access-l7vxm\") pod \"keystone-cron-29431681-lc6pz\" (UID: \"1e7d3155-afab-4f73-98e3-f2ea11f36050\") " pod="openstack/keystone-cron-29431681-lc6pz" Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 16:01:00.383268 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-config-data\") pod \"keystone-cron-29431681-lc6pz\" (UID: \"1e7d3155-afab-4f73-98e3-f2ea11f36050\") " pod="openstack/keystone-cron-29431681-lc6pz" Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 16:01:00.383905 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-fernet-keys\") pod \"keystone-cron-29431681-lc6pz\" (UID: \"1e7d3155-afab-4f73-98e3-f2ea11f36050\") " pod="openstack/keystone-cron-29431681-lc6pz" Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 16:01:00.384446 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-combined-ca-bundle\") pod \"keystone-cron-29431681-lc6pz\" (UID: \"1e7d3155-afab-4f73-98e3-f2ea11f36050\") " pod="openstack/keystone-cron-29431681-lc6pz" Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 16:01:00.396082 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7vxm\" (UniqueName: \"kubernetes.io/projected/1e7d3155-afab-4f73-98e3-f2ea11f36050-kube-api-access-l7vxm\") pod \"keystone-cron-29431681-lc6pz\" (UID: \"1e7d3155-afab-4f73-98e3-f2ea11f36050\") " pod="openstack/keystone-cron-29431681-lc6pz" Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 16:01:00.502098 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29431681-lc6pz" Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 16:01:00.936901 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gr52q" podUID="372d939b-c26e-454a-ab35-dfb3b351bf87" containerName="registry-server" probeResult="failure" output=< Dec 16 16:01:00 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Dec 16 16:01:00 crc kubenswrapper[4775]: > Dec 16 16:01:00 crc kubenswrapper[4775]: I1216 16:01:00.952845 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29431681-lc6pz"] Dec 16 16:01:00 crc kubenswrapper[4775]: W1216 16:01:00.958474 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e7d3155_afab_4f73_98e3_f2ea11f36050.slice/crio-063ae74391cfa49b702751c6be1bad81fd869bbb8b0b1aaff023cfe11dc12dd3 WatchSource:0}: Error finding container 063ae74391cfa49b702751c6be1bad81fd869bbb8b0b1aaff023cfe11dc12dd3: Status 404 returned error can't find the container with id 063ae74391cfa49b702751c6be1bad81fd869bbb8b0b1aaff023cfe11dc12dd3 Dec 16 16:01:01 crc kubenswrapper[4775]: I1216 16:01:01.364910 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431681-lc6pz" event={"ID":"1e7d3155-afab-4f73-98e3-f2ea11f36050","Type":"ContainerStarted","Data":"2242ff4ad15cd38a466ae7069e6453f8db497a36653fff3555a98e5bfd5c3faa"} Dec 16 16:01:01 crc kubenswrapper[4775]: I1216 16:01:01.365261 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431681-lc6pz" event={"ID":"1e7d3155-afab-4f73-98e3-f2ea11f36050","Type":"ContainerStarted","Data":"063ae74391cfa49b702751c6be1bad81fd869bbb8b0b1aaff023cfe11dc12dd3"} Dec 16 16:01:01 crc kubenswrapper[4775]: I1216 16:01:01.386985 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-cron-29431681-lc6pz" podStartSLOduration=1.38696194 podStartE2EDuration="1.38696194s" podCreationTimestamp="2025-12-16 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 16:01:01.377497661 +0000 UTC m=+3986.328576594" watchObservedRunningTime="2025-12-16 16:01:01.38696194 +0000 UTC m=+3986.338040863" Dec 16 16:01:03 crc kubenswrapper[4775]: I1216 16:01:03.338215 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:01:03 crc kubenswrapper[4775]: E1216 16:01:03.338716 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:01:04 crc kubenswrapper[4775]: I1216 16:01:04.392353 4775 generic.go:334] "Generic (PLEG): container finished" podID="1e7d3155-afab-4f73-98e3-f2ea11f36050" containerID="2242ff4ad15cd38a466ae7069e6453f8db497a36653fff3555a98e5bfd5c3faa" exitCode=0 Dec 16 16:01:04 crc kubenswrapper[4775]: I1216 16:01:04.392445 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431681-lc6pz" event={"ID":"1e7d3155-afab-4f73-98e3-f2ea11f36050","Type":"ContainerDied","Data":"2242ff4ad15cd38a466ae7069e6453f8db497a36653fff3555a98e5bfd5c3faa"} Dec 16 16:01:05 crc kubenswrapper[4775]: I1216 16:01:05.757084 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29431681-lc6pz" Dec 16 16:01:05 crc kubenswrapper[4775]: I1216 16:01:05.883536 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-config-data\") pod \"1e7d3155-afab-4f73-98e3-f2ea11f36050\" (UID: \"1e7d3155-afab-4f73-98e3-f2ea11f36050\") " Dec 16 16:01:05 crc kubenswrapper[4775]: I1216 16:01:05.883594 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-combined-ca-bundle\") pod \"1e7d3155-afab-4f73-98e3-f2ea11f36050\" (UID: \"1e7d3155-afab-4f73-98e3-f2ea11f36050\") " Dec 16 16:01:05 crc kubenswrapper[4775]: I1216 16:01:05.883657 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7vxm\" (UniqueName: \"kubernetes.io/projected/1e7d3155-afab-4f73-98e3-f2ea11f36050-kube-api-access-l7vxm\") pod \"1e7d3155-afab-4f73-98e3-f2ea11f36050\" (UID: \"1e7d3155-afab-4f73-98e3-f2ea11f36050\") " Dec 16 16:01:05 crc kubenswrapper[4775]: I1216 16:01:05.883713 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-fernet-keys\") pod \"1e7d3155-afab-4f73-98e3-f2ea11f36050\" (UID: \"1e7d3155-afab-4f73-98e3-f2ea11f36050\") " Dec 16 16:01:05 crc kubenswrapper[4775]: I1216 16:01:05.889554 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e7d3155-afab-4f73-98e3-f2ea11f36050-kube-api-access-l7vxm" (OuterVolumeSpecName: "kube-api-access-l7vxm") pod "1e7d3155-afab-4f73-98e3-f2ea11f36050" (UID: "1e7d3155-afab-4f73-98e3-f2ea11f36050"). InnerVolumeSpecName "kube-api-access-l7vxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:01:05 crc kubenswrapper[4775]: I1216 16:01:05.889566 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1e7d3155-afab-4f73-98e3-f2ea11f36050" (UID: "1e7d3155-afab-4f73-98e3-f2ea11f36050"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 16:01:05 crc kubenswrapper[4775]: I1216 16:01:05.986121 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7vxm\" (UniqueName: \"kubernetes.io/projected/1e7d3155-afab-4f73-98e3-f2ea11f36050-kube-api-access-l7vxm\") on node \"crc\" DevicePath \"\"" Dec 16 16:01:05 crc kubenswrapper[4775]: I1216 16:01:05.986153 4775 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 16:01:06 crc kubenswrapper[4775]: I1216 16:01:06.001718 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e7d3155-afab-4f73-98e3-f2ea11f36050" (UID: "1e7d3155-afab-4f73-98e3-f2ea11f36050"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 16:01:06 crc kubenswrapper[4775]: I1216 16:01:06.023483 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-config-data" (OuterVolumeSpecName: "config-data") pod "1e7d3155-afab-4f73-98e3-f2ea11f36050" (UID: "1e7d3155-afab-4f73-98e3-f2ea11f36050"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 16:01:06 crc kubenswrapper[4775]: I1216 16:01:06.088077 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 16:01:06 crc kubenswrapper[4775]: I1216 16:01:06.088123 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7d3155-afab-4f73-98e3-f2ea11f36050-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 16:01:06 crc kubenswrapper[4775]: I1216 16:01:06.410826 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431681-lc6pz" event={"ID":"1e7d3155-afab-4f73-98e3-f2ea11f36050","Type":"ContainerDied","Data":"063ae74391cfa49b702751c6be1bad81fd869bbb8b0b1aaff023cfe11dc12dd3"} Dec 16 16:01:06 crc kubenswrapper[4775]: I1216 16:01:06.411150 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="063ae74391cfa49b702751c6be1bad81fd869bbb8b0b1aaff023cfe11dc12dd3" Dec 16 16:01:06 crc kubenswrapper[4775]: I1216 16:01:06.410907 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29431681-lc6pz" Dec 16 16:01:09 crc kubenswrapper[4775]: I1216 16:01:09.940791 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gr52q" Dec 16 16:01:10 crc kubenswrapper[4775]: I1216 16:01:10.019257 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gr52q" Dec 16 16:01:10 crc kubenswrapper[4775]: I1216 16:01:10.180141 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gr52q"] Dec 16 16:01:11 crc kubenswrapper[4775]: I1216 16:01:11.457749 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gr52q" podUID="372d939b-c26e-454a-ab35-dfb3b351bf87" containerName="registry-server" containerID="cri-o://aa2a2554131103ebca04bbd9babab9cfc4591901ba56b06cb1c0b4e27ee006be" gracePeriod=2 Dec 16 16:01:11 crc kubenswrapper[4775]: I1216 16:01:11.979099 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gr52q" Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.112800 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbbtk\" (UniqueName: \"kubernetes.io/projected/372d939b-c26e-454a-ab35-dfb3b351bf87-kube-api-access-wbbtk\") pod \"372d939b-c26e-454a-ab35-dfb3b351bf87\" (UID: \"372d939b-c26e-454a-ab35-dfb3b351bf87\") " Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.113046 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/372d939b-c26e-454a-ab35-dfb3b351bf87-utilities\") pod \"372d939b-c26e-454a-ab35-dfb3b351bf87\" (UID: \"372d939b-c26e-454a-ab35-dfb3b351bf87\") " Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.113084 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/372d939b-c26e-454a-ab35-dfb3b351bf87-catalog-content\") pod \"372d939b-c26e-454a-ab35-dfb3b351bf87\" (UID: \"372d939b-c26e-454a-ab35-dfb3b351bf87\") " Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.114188 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/372d939b-c26e-454a-ab35-dfb3b351bf87-utilities" (OuterVolumeSpecName: "utilities") pod "372d939b-c26e-454a-ab35-dfb3b351bf87" (UID: "372d939b-c26e-454a-ab35-dfb3b351bf87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.118394 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/372d939b-c26e-454a-ab35-dfb3b351bf87-kube-api-access-wbbtk" (OuterVolumeSpecName: "kube-api-access-wbbtk") pod "372d939b-c26e-454a-ab35-dfb3b351bf87" (UID: "372d939b-c26e-454a-ab35-dfb3b351bf87"). InnerVolumeSpecName "kube-api-access-wbbtk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.215320 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbbtk\" (UniqueName: \"kubernetes.io/projected/372d939b-c26e-454a-ab35-dfb3b351bf87-kube-api-access-wbbtk\") on node \"crc\" DevicePath \"\"" Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.215364 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/372d939b-c26e-454a-ab35-dfb3b351bf87-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.288500 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/372d939b-c26e-454a-ab35-dfb3b351bf87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "372d939b-c26e-454a-ab35-dfb3b351bf87" (UID: "372d939b-c26e-454a-ab35-dfb3b351bf87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.316898 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/372d939b-c26e-454a-ab35-dfb3b351bf87-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.469245 4775 generic.go:334] "Generic (PLEG): container finished" podID="372d939b-c26e-454a-ab35-dfb3b351bf87" containerID="aa2a2554131103ebca04bbd9babab9cfc4591901ba56b06cb1c0b4e27ee006be" exitCode=0 Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.469309 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gr52q" Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.470097 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr52q" event={"ID":"372d939b-c26e-454a-ab35-dfb3b351bf87","Type":"ContainerDied","Data":"aa2a2554131103ebca04bbd9babab9cfc4591901ba56b06cb1c0b4e27ee006be"} Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.470207 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr52q" event={"ID":"372d939b-c26e-454a-ab35-dfb3b351bf87","Type":"ContainerDied","Data":"f62ca9a0560dc7192e05969fe693ffcaace0ca676d60e1e17d13917f504ff25b"} Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.470252 4775 scope.go:117] "RemoveContainer" containerID="aa2a2554131103ebca04bbd9babab9cfc4591901ba56b06cb1c0b4e27ee006be" Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.499380 4775 scope.go:117] "RemoveContainer" containerID="fba3a2e8c917b24346b16a6e1befbb10815a0d52f35e61aa04c17d467e048dfc" Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.504623 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gr52q"] Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.516688 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gr52q"] Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.545263 4775 scope.go:117] "RemoveContainer" containerID="5f5b68174cc63aad0a59f283641f6490cdfbb98d6c0d8d01e10dc88b5aa03b9e" Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.565169 4775 scope.go:117] "RemoveContainer" containerID="aa2a2554131103ebca04bbd9babab9cfc4591901ba56b06cb1c0b4e27ee006be" Dec 16 16:01:12 crc kubenswrapper[4775]: E1216 16:01:12.565668 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"aa2a2554131103ebca04bbd9babab9cfc4591901ba56b06cb1c0b4e27ee006be\": container with ID starting with aa2a2554131103ebca04bbd9babab9cfc4591901ba56b06cb1c0b4e27ee006be not found: ID does not exist" containerID="aa2a2554131103ebca04bbd9babab9cfc4591901ba56b06cb1c0b4e27ee006be" Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.565717 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2a2554131103ebca04bbd9babab9cfc4591901ba56b06cb1c0b4e27ee006be"} err="failed to get container status \"aa2a2554131103ebca04bbd9babab9cfc4591901ba56b06cb1c0b4e27ee006be\": rpc error: code = NotFound desc = could not find container \"aa2a2554131103ebca04bbd9babab9cfc4591901ba56b06cb1c0b4e27ee006be\": container with ID starting with aa2a2554131103ebca04bbd9babab9cfc4591901ba56b06cb1c0b4e27ee006be not found: ID does not exist" Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.565751 4775 scope.go:117] "RemoveContainer" containerID="fba3a2e8c917b24346b16a6e1befbb10815a0d52f35e61aa04c17d467e048dfc" Dec 16 16:01:12 crc kubenswrapper[4775]: E1216 16:01:12.568246 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fba3a2e8c917b24346b16a6e1befbb10815a0d52f35e61aa04c17d467e048dfc\": container with ID starting with fba3a2e8c917b24346b16a6e1befbb10815a0d52f35e61aa04c17d467e048dfc not found: ID does not exist" containerID="fba3a2e8c917b24346b16a6e1befbb10815a0d52f35e61aa04c17d467e048dfc" Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.568275 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fba3a2e8c917b24346b16a6e1befbb10815a0d52f35e61aa04c17d467e048dfc"} err="failed to get container status \"fba3a2e8c917b24346b16a6e1befbb10815a0d52f35e61aa04c17d467e048dfc\": rpc error: code = NotFound desc = could not find container \"fba3a2e8c917b24346b16a6e1befbb10815a0d52f35e61aa04c17d467e048dfc\": container with ID 
starting with fba3a2e8c917b24346b16a6e1befbb10815a0d52f35e61aa04c17d467e048dfc not found: ID does not exist" Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.568296 4775 scope.go:117] "RemoveContainer" containerID="5f5b68174cc63aad0a59f283641f6490cdfbb98d6c0d8d01e10dc88b5aa03b9e" Dec 16 16:01:12 crc kubenswrapper[4775]: E1216 16:01:12.568694 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f5b68174cc63aad0a59f283641f6490cdfbb98d6c0d8d01e10dc88b5aa03b9e\": container with ID starting with 5f5b68174cc63aad0a59f283641f6490cdfbb98d6c0d8d01e10dc88b5aa03b9e not found: ID does not exist" containerID="5f5b68174cc63aad0a59f283641f6490cdfbb98d6c0d8d01e10dc88b5aa03b9e" Dec 16 16:01:12 crc kubenswrapper[4775]: I1216 16:01:12.568760 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5b68174cc63aad0a59f283641f6490cdfbb98d6c0d8d01e10dc88b5aa03b9e"} err="failed to get container status \"5f5b68174cc63aad0a59f283641f6490cdfbb98d6c0d8d01e10dc88b5aa03b9e\": rpc error: code = NotFound desc = could not find container \"5f5b68174cc63aad0a59f283641f6490cdfbb98d6c0d8d01e10dc88b5aa03b9e\": container with ID starting with 5f5b68174cc63aad0a59f283641f6490cdfbb98d6c0d8d01e10dc88b5aa03b9e not found: ID does not exist" Dec 16 16:01:13 crc kubenswrapper[4775]: I1216 16:01:13.348734 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="372d939b-c26e-454a-ab35-dfb3b351bf87" path="/var/lib/kubelet/pods/372d939b-c26e-454a-ab35-dfb3b351bf87/volumes" Dec 16 16:01:15 crc kubenswrapper[4775]: I1216 16:01:15.345877 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:01:15 crc kubenswrapper[4775]: E1216 16:01:15.346628 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:01:27 crc kubenswrapper[4775]: I1216 16:01:27.338119 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:01:27 crc kubenswrapper[4775]: E1216 16:01:27.338869 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:01:42 crc kubenswrapper[4775]: I1216 16:01:42.338166 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:01:42 crc kubenswrapper[4775]: E1216 16:01:42.339465 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:01:54 crc kubenswrapper[4775]: I1216 16:01:54.338653 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:01:54 crc kubenswrapper[4775]: E1216 16:01:54.339329 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:01:58 crc kubenswrapper[4775]: I1216 16:01:58.768803 4775 scope.go:117] "RemoveContainer" containerID="88a013bf84a782a1ce3257866b7eeb7dc2467d78c9ae77f3aea48e98e49ecdf8" Dec 16 16:02:08 crc kubenswrapper[4775]: I1216 16:02:08.338089 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:02:08 crc kubenswrapper[4775]: E1216 16:02:08.339162 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:02:20 crc kubenswrapper[4775]: I1216 16:02:20.338572 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:02:20 crc kubenswrapper[4775]: E1216 16:02:20.339337 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:02:29 crc kubenswrapper[4775]: I1216 16:02:29.703564 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nwqkz/must-gather-2sghs"] Dec 16 16:02:29 crc 
kubenswrapper[4775]: E1216 16:02:29.705390 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372d939b-c26e-454a-ab35-dfb3b351bf87" containerName="extract-content" Dec 16 16:02:29 crc kubenswrapper[4775]: I1216 16:02:29.705473 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="372d939b-c26e-454a-ab35-dfb3b351bf87" containerName="extract-content" Dec 16 16:02:29 crc kubenswrapper[4775]: E1216 16:02:29.705532 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372d939b-c26e-454a-ab35-dfb3b351bf87" containerName="extract-utilities" Dec 16 16:02:29 crc kubenswrapper[4775]: I1216 16:02:29.705596 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="372d939b-c26e-454a-ab35-dfb3b351bf87" containerName="extract-utilities" Dec 16 16:02:29 crc kubenswrapper[4775]: E1216 16:02:29.705655 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e7d3155-afab-4f73-98e3-f2ea11f36050" containerName="keystone-cron" Dec 16 16:02:29 crc kubenswrapper[4775]: I1216 16:02:29.705712 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e7d3155-afab-4f73-98e3-f2ea11f36050" containerName="keystone-cron" Dec 16 16:02:29 crc kubenswrapper[4775]: E1216 16:02:29.705791 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372d939b-c26e-454a-ab35-dfb3b351bf87" containerName="registry-server" Dec 16 16:02:29 crc kubenswrapper[4775]: I1216 16:02:29.705852 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="372d939b-c26e-454a-ab35-dfb3b351bf87" containerName="registry-server" Dec 16 16:02:29 crc kubenswrapper[4775]: I1216 16:02:29.706108 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="372d939b-c26e-454a-ab35-dfb3b351bf87" containerName="registry-server" Dec 16 16:02:29 crc kubenswrapper[4775]: I1216 16:02:29.706185 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e7d3155-afab-4f73-98e3-f2ea11f36050" containerName="keystone-cron" Dec 16 16:02:29 crc 
kubenswrapper[4775]: I1216 16:02:29.707159 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nwqkz/must-gather-2sghs" Dec 16 16:02:29 crc kubenswrapper[4775]: I1216 16:02:29.709314 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nwqkz"/"openshift-service-ca.crt" Dec 16 16:02:29 crc kubenswrapper[4775]: I1216 16:02:29.709538 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-nwqkz"/"default-dockercfg-tz5bn" Dec 16 16:02:29 crc kubenswrapper[4775]: I1216 16:02:29.711614 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nwqkz"/"kube-root-ca.crt" Dec 16 16:02:29 crc kubenswrapper[4775]: I1216 16:02:29.713787 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nwqkz/must-gather-2sghs"] Dec 16 16:02:29 crc kubenswrapper[4775]: I1216 16:02:29.794230 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/275854a3-0a76-4f6a-9a0a-12605fa0f5b0-must-gather-output\") pod \"must-gather-2sghs\" (UID: \"275854a3-0a76-4f6a-9a0a-12605fa0f5b0\") " pod="openshift-must-gather-nwqkz/must-gather-2sghs" Dec 16 16:02:29 crc kubenswrapper[4775]: I1216 16:02:29.794291 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlcwd\" (UniqueName: \"kubernetes.io/projected/275854a3-0a76-4f6a-9a0a-12605fa0f5b0-kube-api-access-dlcwd\") pod \"must-gather-2sghs\" (UID: \"275854a3-0a76-4f6a-9a0a-12605fa0f5b0\") " pod="openshift-must-gather-nwqkz/must-gather-2sghs" Dec 16 16:02:29 crc kubenswrapper[4775]: I1216 16:02:29.896543 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/275854a3-0a76-4f6a-9a0a-12605fa0f5b0-must-gather-output\") pod 
\"must-gather-2sghs\" (UID: \"275854a3-0a76-4f6a-9a0a-12605fa0f5b0\") " pod="openshift-must-gather-nwqkz/must-gather-2sghs" Dec 16 16:02:29 crc kubenswrapper[4775]: I1216 16:02:29.896959 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlcwd\" (UniqueName: \"kubernetes.io/projected/275854a3-0a76-4f6a-9a0a-12605fa0f5b0-kube-api-access-dlcwd\") pod \"must-gather-2sghs\" (UID: \"275854a3-0a76-4f6a-9a0a-12605fa0f5b0\") " pod="openshift-must-gather-nwqkz/must-gather-2sghs" Dec 16 16:02:29 crc kubenswrapper[4775]: I1216 16:02:29.897440 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/275854a3-0a76-4f6a-9a0a-12605fa0f5b0-must-gather-output\") pod \"must-gather-2sghs\" (UID: \"275854a3-0a76-4f6a-9a0a-12605fa0f5b0\") " pod="openshift-must-gather-nwqkz/must-gather-2sghs" Dec 16 16:02:29 crc kubenswrapper[4775]: I1216 16:02:29.924765 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlcwd\" (UniqueName: \"kubernetes.io/projected/275854a3-0a76-4f6a-9a0a-12605fa0f5b0-kube-api-access-dlcwd\") pod \"must-gather-2sghs\" (UID: \"275854a3-0a76-4f6a-9a0a-12605fa0f5b0\") " pod="openshift-must-gather-nwqkz/must-gather-2sghs" Dec 16 16:02:30 crc kubenswrapper[4775]: I1216 16:02:30.024610 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwqkz/must-gather-2sghs" Dec 16 16:02:30 crc kubenswrapper[4775]: I1216 16:02:30.470648 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nwqkz/must-gather-2sghs"] Dec 16 16:02:31 crc kubenswrapper[4775]: I1216 16:02:31.187738 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwqkz/must-gather-2sghs" event={"ID":"275854a3-0a76-4f6a-9a0a-12605fa0f5b0","Type":"ContainerStarted","Data":"689de27950b1438e5663571c1ee83fde62ec541677dfd8314fcc333f2b83ae53"} Dec 16 16:02:31 crc kubenswrapper[4775]: I1216 16:02:31.188080 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwqkz/must-gather-2sghs" event={"ID":"275854a3-0a76-4f6a-9a0a-12605fa0f5b0","Type":"ContainerStarted","Data":"13ec9540eb0ae3f93edade8c3a2116e4309915db64cd03de4e007df2b2dab524"} Dec 16 16:02:32 crc kubenswrapper[4775]: I1216 16:02:32.202322 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwqkz/must-gather-2sghs" event={"ID":"275854a3-0a76-4f6a-9a0a-12605fa0f5b0","Type":"ContainerStarted","Data":"21352973595c7377da49851b3f90ba5ad19981ae1c5fbbf6fa393a2e213be8fc"} Dec 16 16:02:32 crc kubenswrapper[4775]: I1216 16:02:32.233187 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nwqkz/must-gather-2sghs" podStartSLOduration=3.233166105 podStartE2EDuration="3.233166105s" podCreationTimestamp="2025-12-16 16:02:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 16:02:32.220804796 +0000 UTC m=+4077.171883759" watchObservedRunningTime="2025-12-16 16:02:32.233166105 +0000 UTC m=+4077.184245038" Dec 16 16:02:33 crc kubenswrapper[4775]: I1216 16:02:33.339278 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:02:33 crc 
kubenswrapper[4775]: E1216 16:02:33.339825 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:02:34 crc kubenswrapper[4775]: E1216 16:02:34.439996 4775 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.223:53228->38.102.83.223:37315: read tcp 38.102.83.223:53228->38.102.83.223:37315: read: connection reset by peer Dec 16 16:02:35 crc kubenswrapper[4775]: I1216 16:02:35.351953 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nwqkz/crc-debug-dg5wz"] Dec 16 16:02:35 crc kubenswrapper[4775]: I1216 16:02:35.353065 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwqkz/crc-debug-dg5wz" Dec 16 16:02:35 crc kubenswrapper[4775]: I1216 16:02:35.451939 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b1bbb7c-c389-476f-a22c-84b7585a1648-host\") pod \"crc-debug-dg5wz\" (UID: \"1b1bbb7c-c389-476f-a22c-84b7585a1648\") " pod="openshift-must-gather-nwqkz/crc-debug-dg5wz" Dec 16 16:02:35 crc kubenswrapper[4775]: I1216 16:02:35.452109 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g5s4\" (UniqueName: \"kubernetes.io/projected/1b1bbb7c-c389-476f-a22c-84b7585a1648-kube-api-access-8g5s4\") pod \"crc-debug-dg5wz\" (UID: \"1b1bbb7c-c389-476f-a22c-84b7585a1648\") " pod="openshift-must-gather-nwqkz/crc-debug-dg5wz" Dec 16 16:02:35 crc kubenswrapper[4775]: I1216 16:02:35.565521 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g5s4\" (UniqueName: \"kubernetes.io/projected/1b1bbb7c-c389-476f-a22c-84b7585a1648-kube-api-access-8g5s4\") pod \"crc-debug-dg5wz\" (UID: \"1b1bbb7c-c389-476f-a22c-84b7585a1648\") " pod="openshift-must-gather-nwqkz/crc-debug-dg5wz" Dec 16 16:02:35 crc kubenswrapper[4775]: I1216 16:02:35.566100 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b1bbb7c-c389-476f-a22c-84b7585a1648-host\") pod \"crc-debug-dg5wz\" (UID: \"1b1bbb7c-c389-476f-a22c-84b7585a1648\") " pod="openshift-must-gather-nwqkz/crc-debug-dg5wz" Dec 16 16:02:35 crc kubenswrapper[4775]: I1216 16:02:35.566359 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b1bbb7c-c389-476f-a22c-84b7585a1648-host\") pod \"crc-debug-dg5wz\" (UID: \"1b1bbb7c-c389-476f-a22c-84b7585a1648\") " pod="openshift-must-gather-nwqkz/crc-debug-dg5wz" Dec 16 16:02:35 crc 
kubenswrapper[4775]: I1216 16:02:35.899302 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g5s4\" (UniqueName: \"kubernetes.io/projected/1b1bbb7c-c389-476f-a22c-84b7585a1648-kube-api-access-8g5s4\") pod \"crc-debug-dg5wz\" (UID: \"1b1bbb7c-c389-476f-a22c-84b7585a1648\") " pod="openshift-must-gather-nwqkz/crc-debug-dg5wz" Dec 16 16:02:35 crc kubenswrapper[4775]: I1216 16:02:35.973810 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nwqkz/crc-debug-dg5wz" Dec 16 16:02:36 crc kubenswrapper[4775]: W1216 16:02:36.011061 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b1bbb7c_c389_476f_a22c_84b7585a1648.slice/crio-f8469b68a924234ad90b960e65585879bee55782bfe8ffe2f3f60ec6da6699da WatchSource:0}: Error finding container f8469b68a924234ad90b960e65585879bee55782bfe8ffe2f3f60ec6da6699da: Status 404 returned error can't find the container with id f8469b68a924234ad90b960e65585879bee55782bfe8ffe2f3f60ec6da6699da Dec 16 16:02:36 crc kubenswrapper[4775]: I1216 16:02:36.241314 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwqkz/crc-debug-dg5wz" event={"ID":"1b1bbb7c-c389-476f-a22c-84b7585a1648","Type":"ContainerStarted","Data":"f8469b68a924234ad90b960e65585879bee55782bfe8ffe2f3f60ec6da6699da"} Dec 16 16:02:37 crc kubenswrapper[4775]: I1216 16:02:37.254449 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwqkz/crc-debug-dg5wz" event={"ID":"1b1bbb7c-c389-476f-a22c-84b7585a1648","Type":"ContainerStarted","Data":"2099b23efb91bfc085d3d9717ddc8652222a21fa491a779ce1bcf3b96a0d6ff7"} Dec 16 16:02:37 crc kubenswrapper[4775]: I1216 16:02:37.277029 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nwqkz/crc-debug-dg5wz" podStartSLOduration=2.277011955 podStartE2EDuration="2.277011955s" 
podCreationTimestamp="2025-12-16 16:02:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 16:02:37.26728387 +0000 UTC m=+4082.218362803" watchObservedRunningTime="2025-12-16 16:02:37.277011955 +0000 UTC m=+4082.228090878" Dec 16 16:02:47 crc kubenswrapper[4775]: I1216 16:02:47.338135 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:02:47 crc kubenswrapper[4775]: E1216 16:02:47.338981 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:03:02 crc kubenswrapper[4775]: I1216 16:03:02.337433 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:03:02 crc kubenswrapper[4775]: E1216 16:03:02.338108 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:03:16 crc kubenswrapper[4775]: I1216 16:03:16.338077 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:03:16 crc kubenswrapper[4775]: E1216 16:03:16.340574 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:03:18 crc kubenswrapper[4775]: I1216 16:03:18.062976 4775 generic.go:334] "Generic (PLEG): container finished" podID="1b1bbb7c-c389-476f-a22c-84b7585a1648" containerID="2099b23efb91bfc085d3d9717ddc8652222a21fa491a779ce1bcf3b96a0d6ff7" exitCode=0 Dec 16 16:03:18 crc kubenswrapper[4775]: I1216 16:03:18.063065 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwqkz/crc-debug-dg5wz" event={"ID":"1b1bbb7c-c389-476f-a22c-84b7585a1648","Type":"ContainerDied","Data":"2099b23efb91bfc085d3d9717ddc8652222a21fa491a779ce1bcf3b96a0d6ff7"} Dec 16 16:03:19 crc kubenswrapper[4775]: I1216 16:03:19.179189 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwqkz/crc-debug-dg5wz" Dec 16 16:03:19 crc kubenswrapper[4775]: I1216 16:03:19.216772 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nwqkz/crc-debug-dg5wz"] Dec 16 16:03:19 crc kubenswrapper[4775]: I1216 16:03:19.225225 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nwqkz/crc-debug-dg5wz"] Dec 16 16:03:19 crc kubenswrapper[4775]: I1216 16:03:19.298645 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b1bbb7c-c389-476f-a22c-84b7585a1648-host\") pod \"1b1bbb7c-c389-476f-a22c-84b7585a1648\" (UID: \"1b1bbb7c-c389-476f-a22c-84b7585a1648\") " Dec 16 16:03:19 crc kubenswrapper[4775]: I1216 16:03:19.298717 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g5s4\" (UniqueName: \"kubernetes.io/projected/1b1bbb7c-c389-476f-a22c-84b7585a1648-kube-api-access-8g5s4\") pod \"1b1bbb7c-c389-476f-a22c-84b7585a1648\" (UID: \"1b1bbb7c-c389-476f-a22c-84b7585a1648\") " Dec 16 16:03:19 crc kubenswrapper[4775]: I1216 16:03:19.300236 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b1bbb7c-c389-476f-a22c-84b7585a1648-host" (OuterVolumeSpecName: "host") pod "1b1bbb7c-c389-476f-a22c-84b7585a1648" (UID: "1b1bbb7c-c389-476f-a22c-84b7585a1648"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 16:03:19 crc kubenswrapper[4775]: I1216 16:03:19.305940 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b1bbb7c-c389-476f-a22c-84b7585a1648-kube-api-access-8g5s4" (OuterVolumeSpecName: "kube-api-access-8g5s4") pod "1b1bbb7c-c389-476f-a22c-84b7585a1648" (UID: "1b1bbb7c-c389-476f-a22c-84b7585a1648"). InnerVolumeSpecName "kube-api-access-8g5s4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:03:19 crc kubenswrapper[4775]: I1216 16:03:19.350977 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b1bbb7c-c389-476f-a22c-84b7585a1648" path="/var/lib/kubelet/pods/1b1bbb7c-c389-476f-a22c-84b7585a1648/volumes" Dec 16 16:03:19 crc kubenswrapper[4775]: I1216 16:03:19.400779 4775 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b1bbb7c-c389-476f-a22c-84b7585a1648-host\") on node \"crc\" DevicePath \"\"" Dec 16 16:03:19 crc kubenswrapper[4775]: I1216 16:03:19.400815 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g5s4\" (UniqueName: \"kubernetes.io/projected/1b1bbb7c-c389-476f-a22c-84b7585a1648-kube-api-access-8g5s4\") on node \"crc\" DevicePath \"\"" Dec 16 16:03:20 crc kubenswrapper[4775]: I1216 16:03:20.082035 4775 scope.go:117] "RemoveContainer" containerID="2099b23efb91bfc085d3d9717ddc8652222a21fa491a779ce1bcf3b96a0d6ff7" Dec 16 16:03:20 crc kubenswrapper[4775]: I1216 16:03:20.082140 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwqkz/crc-debug-dg5wz" Dec 16 16:03:20 crc kubenswrapper[4775]: I1216 16:03:20.386751 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nwqkz/crc-debug-ms6j7"] Dec 16 16:03:20 crc kubenswrapper[4775]: E1216 16:03:20.387228 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1bbb7c-c389-476f-a22c-84b7585a1648" containerName="container-00" Dec 16 16:03:20 crc kubenswrapper[4775]: I1216 16:03:20.387243 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1bbb7c-c389-476f-a22c-84b7585a1648" containerName="container-00" Dec 16 16:03:20 crc kubenswrapper[4775]: I1216 16:03:20.387475 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b1bbb7c-c389-476f-a22c-84b7585a1648" containerName="container-00" Dec 16 16:03:20 crc kubenswrapper[4775]: I1216 16:03:20.388737 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nwqkz/crc-debug-ms6j7" Dec 16 16:03:20 crc kubenswrapper[4775]: I1216 16:03:20.420345 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0dcbd0cd-e686-4a43-a719-113a7151cf56-host\") pod \"crc-debug-ms6j7\" (UID: \"0dcbd0cd-e686-4a43-a719-113a7151cf56\") " pod="openshift-must-gather-nwqkz/crc-debug-ms6j7" Dec 16 16:03:20 crc kubenswrapper[4775]: I1216 16:03:20.420432 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-959r8\" (UniqueName: \"kubernetes.io/projected/0dcbd0cd-e686-4a43-a719-113a7151cf56-kube-api-access-959r8\") pod \"crc-debug-ms6j7\" (UID: \"0dcbd0cd-e686-4a43-a719-113a7151cf56\") " pod="openshift-must-gather-nwqkz/crc-debug-ms6j7" Dec 16 16:03:20 crc kubenswrapper[4775]: I1216 16:03:20.521130 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/0dcbd0cd-e686-4a43-a719-113a7151cf56-host\") pod \"crc-debug-ms6j7\" (UID: \"0dcbd0cd-e686-4a43-a719-113a7151cf56\") " pod="openshift-must-gather-nwqkz/crc-debug-ms6j7" Dec 16 16:03:20 crc kubenswrapper[4775]: I1216 16:03:20.521219 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-959r8\" (UniqueName: \"kubernetes.io/projected/0dcbd0cd-e686-4a43-a719-113a7151cf56-kube-api-access-959r8\") pod \"crc-debug-ms6j7\" (UID: \"0dcbd0cd-e686-4a43-a719-113a7151cf56\") " pod="openshift-must-gather-nwqkz/crc-debug-ms6j7" Dec 16 16:03:20 crc kubenswrapper[4775]: I1216 16:03:20.521279 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0dcbd0cd-e686-4a43-a719-113a7151cf56-host\") pod \"crc-debug-ms6j7\" (UID: \"0dcbd0cd-e686-4a43-a719-113a7151cf56\") " pod="openshift-must-gather-nwqkz/crc-debug-ms6j7" Dec 16 16:03:20 crc kubenswrapper[4775]: I1216 16:03:20.537880 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-959r8\" (UniqueName: \"kubernetes.io/projected/0dcbd0cd-e686-4a43-a719-113a7151cf56-kube-api-access-959r8\") pod \"crc-debug-ms6j7\" (UID: \"0dcbd0cd-e686-4a43-a719-113a7151cf56\") " pod="openshift-must-gather-nwqkz/crc-debug-ms6j7" Dec 16 16:03:20 crc kubenswrapper[4775]: I1216 16:03:20.705815 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwqkz/crc-debug-ms6j7" Dec 16 16:03:21 crc kubenswrapper[4775]: I1216 16:03:21.096436 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwqkz/crc-debug-ms6j7" event={"ID":"0dcbd0cd-e686-4a43-a719-113a7151cf56","Type":"ContainerStarted","Data":"a4942c5c28915fe4fb72317971898716d6506f88bf6bc88a056e4f0e0231b1ad"} Dec 16 16:03:21 crc kubenswrapper[4775]: I1216 16:03:21.096484 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwqkz/crc-debug-ms6j7" event={"ID":"0dcbd0cd-e686-4a43-a719-113a7151cf56","Type":"ContainerStarted","Data":"5fae4e6d59dd3d3853f9188241b1920aacdaffcef902d89c8e73e26ef93e8014"} Dec 16 16:03:22 crc kubenswrapper[4775]: I1216 16:03:22.113608 4775 generic.go:334] "Generic (PLEG): container finished" podID="0dcbd0cd-e686-4a43-a719-113a7151cf56" containerID="a4942c5c28915fe4fb72317971898716d6506f88bf6bc88a056e4f0e0231b1ad" exitCode=0 Dec 16 16:03:22 crc kubenswrapper[4775]: I1216 16:03:22.113678 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwqkz/crc-debug-ms6j7" event={"ID":"0dcbd0cd-e686-4a43-a719-113a7151cf56","Type":"ContainerDied","Data":"a4942c5c28915fe4fb72317971898716d6506f88bf6bc88a056e4f0e0231b1ad"} Dec 16 16:03:22 crc kubenswrapper[4775]: I1216 16:03:22.218217 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwqkz/crc-debug-ms6j7" Dec 16 16:03:22 crc kubenswrapper[4775]: I1216 16:03:22.244370 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0dcbd0cd-e686-4a43-a719-113a7151cf56-host\") pod \"0dcbd0cd-e686-4a43-a719-113a7151cf56\" (UID: \"0dcbd0cd-e686-4a43-a719-113a7151cf56\") " Dec 16 16:03:22 crc kubenswrapper[4775]: I1216 16:03:22.244500 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0dcbd0cd-e686-4a43-a719-113a7151cf56-host" (OuterVolumeSpecName: "host") pod "0dcbd0cd-e686-4a43-a719-113a7151cf56" (UID: "0dcbd0cd-e686-4a43-a719-113a7151cf56"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 16:03:22 crc kubenswrapper[4775]: I1216 16:03:22.244631 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-959r8\" (UniqueName: \"kubernetes.io/projected/0dcbd0cd-e686-4a43-a719-113a7151cf56-kube-api-access-959r8\") pod \"0dcbd0cd-e686-4a43-a719-113a7151cf56\" (UID: \"0dcbd0cd-e686-4a43-a719-113a7151cf56\") " Dec 16 16:03:22 crc kubenswrapper[4775]: I1216 16:03:22.245025 4775 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0dcbd0cd-e686-4a43-a719-113a7151cf56-host\") on node \"crc\" DevicePath \"\"" Dec 16 16:03:22 crc kubenswrapper[4775]: I1216 16:03:22.251472 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dcbd0cd-e686-4a43-a719-113a7151cf56-kube-api-access-959r8" (OuterVolumeSpecName: "kube-api-access-959r8") pod "0dcbd0cd-e686-4a43-a719-113a7151cf56" (UID: "0dcbd0cd-e686-4a43-a719-113a7151cf56"). InnerVolumeSpecName "kube-api-access-959r8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:03:22 crc kubenswrapper[4775]: I1216 16:03:22.346227 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-959r8\" (UniqueName: \"kubernetes.io/projected/0dcbd0cd-e686-4a43-a719-113a7151cf56-kube-api-access-959r8\") on node \"crc\" DevicePath \"\"" Dec 16 16:03:22 crc kubenswrapper[4775]: I1216 16:03:22.845957 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nwqkz/crc-debug-ms6j7"] Dec 16 16:03:22 crc kubenswrapper[4775]: I1216 16:03:22.853831 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nwqkz/crc-debug-ms6j7"] Dec 16 16:03:23 crc kubenswrapper[4775]: I1216 16:03:23.124871 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fae4e6d59dd3d3853f9188241b1920aacdaffcef902d89c8e73e26ef93e8014" Dec 16 16:03:23 crc kubenswrapper[4775]: I1216 16:03:23.125025 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwqkz/crc-debug-ms6j7" Dec 16 16:03:23 crc kubenswrapper[4775]: I1216 16:03:23.348273 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dcbd0cd-e686-4a43-a719-113a7151cf56" path="/var/lib/kubelet/pods/0dcbd0cd-e686-4a43-a719-113a7151cf56/volumes" Dec 16 16:03:24 crc kubenswrapper[4775]: I1216 16:03:24.006417 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nwqkz/crc-debug-l59fq"] Dec 16 16:03:24 crc kubenswrapper[4775]: E1216 16:03:24.007213 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dcbd0cd-e686-4a43-a719-113a7151cf56" containerName="container-00" Dec 16 16:03:24 crc kubenswrapper[4775]: I1216 16:03:24.007239 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dcbd0cd-e686-4a43-a719-113a7151cf56" containerName="container-00" Dec 16 16:03:24 crc kubenswrapper[4775]: I1216 16:03:24.007475 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dcbd0cd-e686-4a43-a719-113a7151cf56" containerName="container-00" Dec 16 16:03:24 crc kubenswrapper[4775]: I1216 16:03:24.008307 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwqkz/crc-debug-l59fq" Dec 16 16:03:24 crc kubenswrapper[4775]: I1216 16:03:24.174867 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c77f21b0-8568-4d04-b02d-4781cfc5fbf4-host\") pod \"crc-debug-l59fq\" (UID: \"c77f21b0-8568-4d04-b02d-4781cfc5fbf4\") " pod="openshift-must-gather-nwqkz/crc-debug-l59fq" Dec 16 16:03:24 crc kubenswrapper[4775]: I1216 16:03:24.174976 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hpkq\" (UniqueName: \"kubernetes.io/projected/c77f21b0-8568-4d04-b02d-4781cfc5fbf4-kube-api-access-4hpkq\") pod \"crc-debug-l59fq\" (UID: \"c77f21b0-8568-4d04-b02d-4781cfc5fbf4\") " pod="openshift-must-gather-nwqkz/crc-debug-l59fq" Dec 16 16:03:24 crc kubenswrapper[4775]: I1216 16:03:24.276739 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c77f21b0-8568-4d04-b02d-4781cfc5fbf4-host\") pod \"crc-debug-l59fq\" (UID: \"c77f21b0-8568-4d04-b02d-4781cfc5fbf4\") " pod="openshift-must-gather-nwqkz/crc-debug-l59fq" Dec 16 16:03:24 crc kubenswrapper[4775]: I1216 16:03:24.276823 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hpkq\" (UniqueName: \"kubernetes.io/projected/c77f21b0-8568-4d04-b02d-4781cfc5fbf4-kube-api-access-4hpkq\") pod \"crc-debug-l59fq\" (UID: \"c77f21b0-8568-4d04-b02d-4781cfc5fbf4\") " pod="openshift-must-gather-nwqkz/crc-debug-l59fq" Dec 16 16:03:24 crc kubenswrapper[4775]: I1216 16:03:24.276851 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c77f21b0-8568-4d04-b02d-4781cfc5fbf4-host\") pod \"crc-debug-l59fq\" (UID: \"c77f21b0-8568-4d04-b02d-4781cfc5fbf4\") " pod="openshift-must-gather-nwqkz/crc-debug-l59fq" Dec 16 16:03:24 crc 
kubenswrapper[4775]: I1216 16:03:24.409540 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hpkq\" (UniqueName: \"kubernetes.io/projected/c77f21b0-8568-4d04-b02d-4781cfc5fbf4-kube-api-access-4hpkq\") pod \"crc-debug-l59fq\" (UID: \"c77f21b0-8568-4d04-b02d-4781cfc5fbf4\") " pod="openshift-must-gather-nwqkz/crc-debug-l59fq" Dec 16 16:03:24 crc kubenswrapper[4775]: I1216 16:03:24.632533 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nwqkz/crc-debug-l59fq" Dec 16 16:03:25 crc kubenswrapper[4775]: I1216 16:03:25.142223 4775 generic.go:334] "Generic (PLEG): container finished" podID="c77f21b0-8568-4d04-b02d-4781cfc5fbf4" containerID="75b651002ee8821bebb48059e9a59de04ca30116a981f044beb55a6bb4f5cfe1" exitCode=0 Dec 16 16:03:25 crc kubenswrapper[4775]: I1216 16:03:25.142383 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwqkz/crc-debug-l59fq" event={"ID":"c77f21b0-8568-4d04-b02d-4781cfc5fbf4","Type":"ContainerDied","Data":"75b651002ee8821bebb48059e9a59de04ca30116a981f044beb55a6bb4f5cfe1"} Dec 16 16:03:25 crc kubenswrapper[4775]: I1216 16:03:25.142539 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwqkz/crc-debug-l59fq" event={"ID":"c77f21b0-8568-4d04-b02d-4781cfc5fbf4","Type":"ContainerStarted","Data":"0b5dc3eb434d5eaad59b1e11e2fc538d4070bd1af66b78ef0d6d595fb349cdfb"} Dec 16 16:03:25 crc kubenswrapper[4775]: I1216 16:03:25.180198 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nwqkz/crc-debug-l59fq"] Dec 16 16:03:25 crc kubenswrapper[4775]: I1216 16:03:25.188677 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nwqkz/crc-debug-l59fq"] Dec 16 16:03:26 crc kubenswrapper[4775]: I1216 16:03:26.265158 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwqkz/crc-debug-l59fq" Dec 16 16:03:26 crc kubenswrapper[4775]: I1216 16:03:26.317793 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c77f21b0-8568-4d04-b02d-4781cfc5fbf4-host\") pod \"c77f21b0-8568-4d04-b02d-4781cfc5fbf4\" (UID: \"c77f21b0-8568-4d04-b02d-4781cfc5fbf4\") " Dec 16 16:03:26 crc kubenswrapper[4775]: I1216 16:03:26.317878 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hpkq\" (UniqueName: \"kubernetes.io/projected/c77f21b0-8568-4d04-b02d-4781cfc5fbf4-kube-api-access-4hpkq\") pod \"c77f21b0-8568-4d04-b02d-4781cfc5fbf4\" (UID: \"c77f21b0-8568-4d04-b02d-4781cfc5fbf4\") " Dec 16 16:03:26 crc kubenswrapper[4775]: I1216 16:03:26.317928 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c77f21b0-8568-4d04-b02d-4781cfc5fbf4-host" (OuterVolumeSpecName: "host") pod "c77f21b0-8568-4d04-b02d-4781cfc5fbf4" (UID: "c77f21b0-8568-4d04-b02d-4781cfc5fbf4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 16:03:26 crc kubenswrapper[4775]: I1216 16:03:26.318195 4775 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c77f21b0-8568-4d04-b02d-4781cfc5fbf4-host\") on node \"crc\" DevicePath \"\"" Dec 16 16:03:26 crc kubenswrapper[4775]: I1216 16:03:26.327483 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c77f21b0-8568-4d04-b02d-4781cfc5fbf4-kube-api-access-4hpkq" (OuterVolumeSpecName: "kube-api-access-4hpkq") pod "c77f21b0-8568-4d04-b02d-4781cfc5fbf4" (UID: "c77f21b0-8568-4d04-b02d-4781cfc5fbf4"). InnerVolumeSpecName "kube-api-access-4hpkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:03:26 crc kubenswrapper[4775]: I1216 16:03:26.419395 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hpkq\" (UniqueName: \"kubernetes.io/projected/c77f21b0-8568-4d04-b02d-4781cfc5fbf4-kube-api-access-4hpkq\") on node \"crc\" DevicePath \"\"" Dec 16 16:03:27 crc kubenswrapper[4775]: I1216 16:03:27.161572 4775 scope.go:117] "RemoveContainer" containerID="75b651002ee8821bebb48059e9a59de04ca30116a981f044beb55a6bb4f5cfe1" Dec 16 16:03:27 crc kubenswrapper[4775]: I1216 16:03:27.161626 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nwqkz/crc-debug-l59fq" Dec 16 16:03:27 crc kubenswrapper[4775]: I1216 16:03:27.348099 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c77f21b0-8568-4d04-b02d-4781cfc5fbf4" path="/var/lib/kubelet/pods/c77f21b0-8568-4d04-b02d-4781cfc5fbf4/volumes" Dec 16 16:03:28 crc kubenswrapper[4775]: I1216 16:03:28.338547 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:03:28 crc kubenswrapper[4775]: E1216 16:03:28.339133 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:03:35 crc kubenswrapper[4775]: I1216 16:03:35.647479 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2d7kz"] Dec 16 16:03:35 crc kubenswrapper[4775]: E1216 16:03:35.655077 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c77f21b0-8568-4d04-b02d-4781cfc5fbf4" 
containerName="container-00" Dec 16 16:03:35 crc kubenswrapper[4775]: I1216 16:03:35.655325 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77f21b0-8568-4d04-b02d-4781cfc5fbf4" containerName="container-00" Dec 16 16:03:35 crc kubenswrapper[4775]: I1216 16:03:35.655575 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c77f21b0-8568-4d04-b02d-4781cfc5fbf4" containerName="container-00" Dec 16 16:03:35 crc kubenswrapper[4775]: I1216 16:03:35.657178 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2d7kz" Dec 16 16:03:35 crc kubenswrapper[4775]: I1216 16:03:35.660637 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2d7kz"] Dec 16 16:03:35 crc kubenswrapper[4775]: I1216 16:03:35.790199 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/071fb55f-063a-421f-9431-a806fcc00b6a-catalog-content\") pod \"redhat-marketplace-2d7kz\" (UID: \"071fb55f-063a-421f-9431-a806fcc00b6a\") " pod="openshift-marketplace/redhat-marketplace-2d7kz" Dec 16 16:03:35 crc kubenswrapper[4775]: I1216 16:03:35.790309 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9qk6\" (UniqueName: \"kubernetes.io/projected/071fb55f-063a-421f-9431-a806fcc00b6a-kube-api-access-x9qk6\") pod \"redhat-marketplace-2d7kz\" (UID: \"071fb55f-063a-421f-9431-a806fcc00b6a\") " pod="openshift-marketplace/redhat-marketplace-2d7kz" Dec 16 16:03:35 crc kubenswrapper[4775]: I1216 16:03:35.790335 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/071fb55f-063a-421f-9431-a806fcc00b6a-utilities\") pod \"redhat-marketplace-2d7kz\" (UID: \"071fb55f-063a-421f-9431-a806fcc00b6a\") " 
pod="openshift-marketplace/redhat-marketplace-2d7kz" Dec 16 16:03:35 crc kubenswrapper[4775]: I1216 16:03:35.892358 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9qk6\" (UniqueName: \"kubernetes.io/projected/071fb55f-063a-421f-9431-a806fcc00b6a-kube-api-access-x9qk6\") pod \"redhat-marketplace-2d7kz\" (UID: \"071fb55f-063a-421f-9431-a806fcc00b6a\") " pod="openshift-marketplace/redhat-marketplace-2d7kz" Dec 16 16:03:35 crc kubenswrapper[4775]: I1216 16:03:35.892733 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/071fb55f-063a-421f-9431-a806fcc00b6a-utilities\") pod \"redhat-marketplace-2d7kz\" (UID: \"071fb55f-063a-421f-9431-a806fcc00b6a\") " pod="openshift-marketplace/redhat-marketplace-2d7kz" Dec 16 16:03:35 crc kubenswrapper[4775]: I1216 16:03:35.893156 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/071fb55f-063a-421f-9431-a806fcc00b6a-utilities\") pod \"redhat-marketplace-2d7kz\" (UID: \"071fb55f-063a-421f-9431-a806fcc00b6a\") " pod="openshift-marketplace/redhat-marketplace-2d7kz" Dec 16 16:03:35 crc kubenswrapper[4775]: I1216 16:03:35.893263 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/071fb55f-063a-421f-9431-a806fcc00b6a-catalog-content\") pod \"redhat-marketplace-2d7kz\" (UID: \"071fb55f-063a-421f-9431-a806fcc00b6a\") " pod="openshift-marketplace/redhat-marketplace-2d7kz" Dec 16 16:03:35 crc kubenswrapper[4775]: I1216 16:03:35.893544 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/071fb55f-063a-421f-9431-a806fcc00b6a-catalog-content\") pod \"redhat-marketplace-2d7kz\" (UID: \"071fb55f-063a-421f-9431-a806fcc00b6a\") " pod="openshift-marketplace/redhat-marketplace-2d7kz" 
Dec 16 16:03:35 crc kubenswrapper[4775]: I1216 16:03:35.915680 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9qk6\" (UniqueName: \"kubernetes.io/projected/071fb55f-063a-421f-9431-a806fcc00b6a-kube-api-access-x9qk6\") pod \"redhat-marketplace-2d7kz\" (UID: \"071fb55f-063a-421f-9431-a806fcc00b6a\") " pod="openshift-marketplace/redhat-marketplace-2d7kz" Dec 16 16:03:35 crc kubenswrapper[4775]: I1216 16:03:35.975010 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2d7kz" Dec 16 16:03:36 crc kubenswrapper[4775]: I1216 16:03:36.450761 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2d7kz"] Dec 16 16:03:37 crc kubenswrapper[4775]: I1216 16:03:37.251544 4775 generic.go:334] "Generic (PLEG): container finished" podID="071fb55f-063a-421f-9431-a806fcc00b6a" containerID="f59618084b9e605bc165d219885091b20b9832de193c37006f3c999d1e0180a4" exitCode=0 Dec 16 16:03:37 crc kubenswrapper[4775]: I1216 16:03:37.251633 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d7kz" event={"ID":"071fb55f-063a-421f-9431-a806fcc00b6a","Type":"ContainerDied","Data":"f59618084b9e605bc165d219885091b20b9832de193c37006f3c999d1e0180a4"} Dec 16 16:03:37 crc kubenswrapper[4775]: I1216 16:03:37.251878 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d7kz" event={"ID":"071fb55f-063a-421f-9431-a806fcc00b6a","Type":"ContainerStarted","Data":"6d98ae62a94733d1a2cdbf891877c4209e0d29936896669ff77785d3a9cc2ff2"} Dec 16 16:03:38 crc kubenswrapper[4775]: I1216 16:03:38.277503 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d7kz" event={"ID":"071fb55f-063a-421f-9431-a806fcc00b6a","Type":"ContainerStarted","Data":"924857a14eda5f592055e638d9e7eaadc3d33b870888ce4e7969061feaaa22b2"} Dec 16 
16:03:39 crc kubenswrapper[4775]: I1216 16:03:39.288208 4775 generic.go:334] "Generic (PLEG): container finished" podID="071fb55f-063a-421f-9431-a806fcc00b6a" containerID="924857a14eda5f592055e638d9e7eaadc3d33b870888ce4e7969061feaaa22b2" exitCode=0 Dec 16 16:03:39 crc kubenswrapper[4775]: I1216 16:03:39.288268 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d7kz" event={"ID":"071fb55f-063a-421f-9431-a806fcc00b6a","Type":"ContainerDied","Data":"924857a14eda5f592055e638d9e7eaadc3d33b870888ce4e7969061feaaa22b2"} Dec 16 16:03:40 crc kubenswrapper[4775]: I1216 16:03:40.298722 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d7kz" event={"ID":"071fb55f-063a-421f-9431-a806fcc00b6a","Type":"ContainerStarted","Data":"5ca5a6d21939a11154ec1b0b2a4240d2e0cb932977d5bbdc5f020f1d736cc9eb"} Dec 16 16:03:40 crc kubenswrapper[4775]: I1216 16:03:40.320262 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2d7kz" podStartSLOduration=2.7670080710000002 podStartE2EDuration="5.320241636s" podCreationTimestamp="2025-12-16 16:03:35 +0000 UTC" firstStartedPulling="2025-12-16 16:03:37.25337019 +0000 UTC m=+4142.204449103" lastFinishedPulling="2025-12-16 16:03:39.806603745 +0000 UTC m=+4144.757682668" observedRunningTime="2025-12-16 16:03:40.314413942 +0000 UTC m=+4145.265492875" watchObservedRunningTime="2025-12-16 16:03:40.320241636 +0000 UTC m=+4145.271320559" Dec 16 16:03:42 crc kubenswrapper[4775]: I1216 16:03:42.337488 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:03:43 crc kubenswrapper[4775]: I1216 16:03:43.329286 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" 
event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"463c218c74a81c197a6e325e9e6e91831b9764405b5f63ce630e6f2e7837e133"} Dec 16 16:03:45 crc kubenswrapper[4775]: I1216 16:03:45.976649 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2d7kz" Dec 16 16:03:45 crc kubenswrapper[4775]: I1216 16:03:45.977084 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2d7kz" Dec 16 16:03:46 crc kubenswrapper[4775]: I1216 16:03:46.024352 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2d7kz" Dec 16 16:03:46 crc kubenswrapper[4775]: I1216 16:03:46.409528 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2d7kz" Dec 16 16:03:46 crc kubenswrapper[4775]: I1216 16:03:46.460027 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2d7kz"] Dec 16 16:03:46 crc kubenswrapper[4775]: I1216 16:03:46.981491 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5cdf7cd694-pv7bs_38f1f660-5367-4db0-a653-c72807682175/barbican-api/0.log" Dec 16 16:03:47 crc kubenswrapper[4775]: I1216 16:03:47.182175 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5cdf7cd694-pv7bs_38f1f660-5367-4db0-a653-c72807682175/barbican-api-log/0.log" Dec 16 16:03:47 crc kubenswrapper[4775]: I1216 16:03:47.185262 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6c8466bf58-6vkrk_ee4d6d93-229d-499b-8121-123db79d7758/barbican-keystone-listener/0.log" Dec 16 16:03:47 crc kubenswrapper[4775]: I1216 16:03:47.338983 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-6c8466bf58-6vkrk_ee4d6d93-229d-499b-8121-123db79d7758/barbican-keystone-listener-log/0.log" Dec 16 16:03:47 crc kubenswrapper[4775]: I1216 16:03:47.409655 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d485fb59c-wh26t_1035b49f-bf1a-44ee-9cd4-01df93145086/barbican-worker/0.log" Dec 16 16:03:47 crc kubenswrapper[4775]: I1216 16:03:47.438595 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d485fb59c-wh26t_1035b49f-bf1a-44ee-9cd4-01df93145086/barbican-worker-log/0.log" Dec 16 16:03:47 crc kubenswrapper[4775]: I1216 16:03:47.641935 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-jg95f_2b2d1ae7-ec42-4c6c-9400-966f2093d883/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:03:47 crc kubenswrapper[4775]: I1216 16:03:47.846111 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81d44d7a-71b0-40da-b940-ccdb6d63b4f9/ceilometer-central-agent/0.log" Dec 16 16:03:47 crc kubenswrapper[4775]: I1216 16:03:47.995317 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81d44d7a-71b0-40da-b940-ccdb6d63b4f9/proxy-httpd/0.log" Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 16:03:48.006779 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81d44d7a-71b0-40da-b940-ccdb6d63b4f9/ceilometer-notification-agent/0.log" Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 16:03:48.033172 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81d44d7a-71b0-40da-b940-ccdb6d63b4f9/sg-core/0.log" Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 16:03:48.207622 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_521e4d08-bfb5-4043-bc0f-7515dbeb467f/cinder-api-log/0.log" Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 
16:03:48.217001 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_521e4d08-bfb5-4043-bc0f-7515dbeb467f/cinder-api/0.log" Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 16:03:48.368459 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2d7kz" podUID="071fb55f-063a-421f-9431-a806fcc00b6a" containerName="registry-server" containerID="cri-o://5ca5a6d21939a11154ec1b0b2a4240d2e0cb932977d5bbdc5f020f1d736cc9eb" gracePeriod=2 Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 16:03:48.405321 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fc19d8c3-9cbc-45db-ad19-ab8a38792218/cinder-scheduler/0.log" Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 16:03:48.482599 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fc19d8c3-9cbc-45db-ad19-ab8a38792218/probe/0.log" Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 16:03:48.489659 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-76qtc_14cad095-639f-4735-8e83-d5a2abd771c3/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 16:03:48.680270 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-hsfk7_410c8945-6eac-4dd6-943b-a2024de59d58/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 16:03:48.784448 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-f8mml_cb5b019a-c088-4515-91e1-a110d1ee04c9/init/0.log" Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 16:03:48.870236 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2d7kz" Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 16:03:48.932084 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9qk6\" (UniqueName: \"kubernetes.io/projected/071fb55f-063a-421f-9431-a806fcc00b6a-kube-api-access-x9qk6\") pod \"071fb55f-063a-421f-9431-a806fcc00b6a\" (UID: \"071fb55f-063a-421f-9431-a806fcc00b6a\") " Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 16:03:48.932338 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/071fb55f-063a-421f-9431-a806fcc00b6a-catalog-content\") pod \"071fb55f-063a-421f-9431-a806fcc00b6a\" (UID: \"071fb55f-063a-421f-9431-a806fcc00b6a\") " Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 16:03:48.932366 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/071fb55f-063a-421f-9431-a806fcc00b6a-utilities\") pod \"071fb55f-063a-421f-9431-a806fcc00b6a\" (UID: \"071fb55f-063a-421f-9431-a806fcc00b6a\") " Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 16:03:48.933222 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/071fb55f-063a-421f-9431-a806fcc00b6a-utilities" (OuterVolumeSpecName: "utilities") pod "071fb55f-063a-421f-9431-a806fcc00b6a" (UID: "071fb55f-063a-421f-9431-a806fcc00b6a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 16:03:48.945031 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-f8mml_cb5b019a-c088-4515-91e1-a110d1ee04c9/init/0.log" Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 16:03:48.948621 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/071fb55f-063a-421f-9431-a806fcc00b6a-kube-api-access-x9qk6" (OuterVolumeSpecName: "kube-api-access-x9qk6") pod "071fb55f-063a-421f-9431-a806fcc00b6a" (UID: "071fb55f-063a-421f-9431-a806fcc00b6a"). InnerVolumeSpecName "kube-api-access-x9qk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:03:48 crc kubenswrapper[4775]: I1216 16:03:48.962917 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/071fb55f-063a-421f-9431-a806fcc00b6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "071fb55f-063a-421f-9431-a806fcc00b6a" (UID: "071fb55f-063a-421f-9431-a806fcc00b6a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.021319 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-f8mml_cb5b019a-c088-4515-91e1-a110d1ee04c9/dnsmasq-dns/0.log" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.034928 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/071fb55f-063a-421f-9431-a806fcc00b6a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.034973 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/071fb55f-063a-421f-9431-a806fcc00b6a-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.034989 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9qk6\" (UniqueName: \"kubernetes.io/projected/071fb55f-063a-421f-9431-a806fcc00b6a-kube-api-access-x9qk6\") on node \"crc\" DevicePath \"\"" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.081015 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-cq6zp_32966f09-4e16-4fcb-925e-edb1c957cea1/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.205555 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7732fee4-0518-41db-be31-b9c7ae4aca6b/glance-httpd/0.log" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.273354 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7732fee4-0518-41db-be31-b9c7ae4aca6b/glance-log/0.log" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.375106 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_0e61daa0-8bea-4632-8936-5fb68d555ab1/glance-httpd/0.log" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.379532 4775 generic.go:334] "Generic (PLEG): container finished" podID="071fb55f-063a-421f-9431-a806fcc00b6a" containerID="5ca5a6d21939a11154ec1b0b2a4240d2e0cb932977d5bbdc5f020f1d736cc9eb" exitCode=0 Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.379573 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d7kz" event={"ID":"071fb55f-063a-421f-9431-a806fcc00b6a","Type":"ContainerDied","Data":"5ca5a6d21939a11154ec1b0b2a4240d2e0cb932977d5bbdc5f020f1d736cc9eb"} Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.379600 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d7kz" event={"ID":"071fb55f-063a-421f-9431-a806fcc00b6a","Type":"ContainerDied","Data":"6d98ae62a94733d1a2cdbf891877c4209e0d29936896669ff77785d3a9cc2ff2"} Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.379618 4775 scope.go:117] "RemoveContainer" containerID="5ca5a6d21939a11154ec1b0b2a4240d2e0cb932977d5bbdc5f020f1d736cc9eb" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.379737 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2d7kz" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.408547 4775 scope.go:117] "RemoveContainer" containerID="924857a14eda5f592055e638d9e7eaadc3d33b870888ce4e7969061feaaa22b2" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.409188 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2d7kz"] Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.435214 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0e61daa0-8bea-4632-8936-5fb68d555ab1/glance-log/0.log" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.443765 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2d7kz"] Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.447968 4775 scope.go:117] "RemoveContainer" containerID="f59618084b9e605bc165d219885091b20b9832de193c37006f3c999d1e0180a4" Dec 16 16:03:49 crc kubenswrapper[4775]: E1216 16:03:49.467274 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod071fb55f_063a_421f_9431_a806fcc00b6a.slice\": RecentStats: unable to find data in memory cache]" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.479346 4775 scope.go:117] "RemoveContainer" containerID="5ca5a6d21939a11154ec1b0b2a4240d2e0cb932977d5bbdc5f020f1d736cc9eb" Dec 16 16:03:49 crc kubenswrapper[4775]: E1216 16:03:49.481928 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ca5a6d21939a11154ec1b0b2a4240d2e0cb932977d5bbdc5f020f1d736cc9eb\": container with ID starting with 5ca5a6d21939a11154ec1b0b2a4240d2e0cb932977d5bbdc5f020f1d736cc9eb not found: ID does not exist" containerID="5ca5a6d21939a11154ec1b0b2a4240d2e0cb932977d5bbdc5f020f1d736cc9eb" Dec 16 16:03:49 
crc kubenswrapper[4775]: I1216 16:03:49.481974 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ca5a6d21939a11154ec1b0b2a4240d2e0cb932977d5bbdc5f020f1d736cc9eb"} err="failed to get container status \"5ca5a6d21939a11154ec1b0b2a4240d2e0cb932977d5bbdc5f020f1d736cc9eb\": rpc error: code = NotFound desc = could not find container \"5ca5a6d21939a11154ec1b0b2a4240d2e0cb932977d5bbdc5f020f1d736cc9eb\": container with ID starting with 5ca5a6d21939a11154ec1b0b2a4240d2e0cb932977d5bbdc5f020f1d736cc9eb not found: ID does not exist" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.482003 4775 scope.go:117] "RemoveContainer" containerID="924857a14eda5f592055e638d9e7eaadc3d33b870888ce4e7969061feaaa22b2" Dec 16 16:03:49 crc kubenswrapper[4775]: E1216 16:03:49.482324 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"924857a14eda5f592055e638d9e7eaadc3d33b870888ce4e7969061feaaa22b2\": container with ID starting with 924857a14eda5f592055e638d9e7eaadc3d33b870888ce4e7969061feaaa22b2 not found: ID does not exist" containerID="924857a14eda5f592055e638d9e7eaadc3d33b870888ce4e7969061feaaa22b2" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.482384 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"924857a14eda5f592055e638d9e7eaadc3d33b870888ce4e7969061feaaa22b2"} err="failed to get container status \"924857a14eda5f592055e638d9e7eaadc3d33b870888ce4e7969061feaaa22b2\": rpc error: code = NotFound desc = could not find container \"924857a14eda5f592055e638d9e7eaadc3d33b870888ce4e7969061feaaa22b2\": container with ID starting with 924857a14eda5f592055e638d9e7eaadc3d33b870888ce4e7969061feaaa22b2 not found: ID does not exist" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.482406 4775 scope.go:117] "RemoveContainer" containerID="f59618084b9e605bc165d219885091b20b9832de193c37006f3c999d1e0180a4" Dec 16 
16:03:49 crc kubenswrapper[4775]: E1216 16:03:49.483758 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f59618084b9e605bc165d219885091b20b9832de193c37006f3c999d1e0180a4\": container with ID starting with f59618084b9e605bc165d219885091b20b9832de193c37006f3c999d1e0180a4 not found: ID does not exist" containerID="f59618084b9e605bc165d219885091b20b9832de193c37006f3c999d1e0180a4" Dec 16 16:03:49 crc kubenswrapper[4775]: I1216 16:03:49.483787 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f59618084b9e605bc165d219885091b20b9832de193c37006f3c999d1e0180a4"} err="failed to get container status \"f59618084b9e605bc165d219885091b20b9832de193c37006f3c999d1e0180a4\": rpc error: code = NotFound desc = could not find container \"f59618084b9e605bc165d219885091b20b9832de193c37006f3c999d1e0180a4\": container with ID starting with f59618084b9e605bc165d219885091b20b9832de193c37006f3c999d1e0180a4 not found: ID does not exist" Dec 16 16:03:50 crc kubenswrapper[4775]: I1216 16:03:50.112368 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-59fcc7f56d-krpcl_262d5cc2-3677-4f62-aa93-60ccab4cf899/heat-engine/0.log" Dec 16 16:03:50 crc kubenswrapper[4775]: I1216 16:03:50.215994 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-69676fb7c9-tmm27_d9a8d05d-1353-46db-9367-c7205a7d39d9/heat-cfnapi/0.log" Dec 16 16:03:50 crc kubenswrapper[4775]: I1216 16:03:50.255301 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-mxktb_5f07dc8f-f161-4826-b191-4344f1b741e0/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:03:50 crc kubenswrapper[4775]: I1216 16:03:50.317716 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-55844f6789-qwjbq_ea336475-3963-43eb-9e16-814d0c717625/heat-api/0.log" 
Dec 16 16:03:50 crc kubenswrapper[4775]: I1216 16:03:50.424766 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-lv5v6_c99310e2-070c-4bed-b14d-850dfd069353/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:03:50 crc kubenswrapper[4775]: I1216 16:03:50.628006 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29431681-lc6pz_1e7d3155-afab-4f73-98e3-f2ea11f36050/keystone-cron/0.log" Dec 16 16:03:50 crc kubenswrapper[4775]: I1216 16:03:50.804907 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_44120d84-ab08-40cb-ad82-59518b6f55b2/kube-state-metrics/0.log" Dec 16 16:03:50 crc kubenswrapper[4775]: I1216 16:03:50.836705 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-779ff79b57-nb7bt_2e88e1b8-8837-49a6-9769-ddab7adfb812/keystone-api/0.log" Dec 16 16:03:50 crc kubenswrapper[4775]: I1216 16:03:50.955932 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-kv2q9_0570786e-5fec-43cf-b7ec-12a4facea06d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:03:51 crc kubenswrapper[4775]: I1216 16:03:51.351952 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="071fb55f-063a-421f-9431-a806fcc00b6a" path="/var/lib/kubelet/pods/071fb55f-063a-421f-9431-a806fcc00b6a/volumes" Dec 16 16:03:51 crc kubenswrapper[4775]: I1216 16:03:51.356828 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-69cbb5df9f-wmvhj_66ed3d02-9c61-43a8-90bf-35d00458d088/neutron-api/0.log" Dec 16 16:03:51 crc kubenswrapper[4775]: I1216 16:03:51.444912 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-gkqzh_9a992ef8-ad46-4e3a-a98a-dc75ad484c7e/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:03:51 
crc kubenswrapper[4775]: I1216 16:03:51.466533 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-69cbb5df9f-wmvhj_66ed3d02-9c61-43a8-90bf-35d00458d088/neutron-httpd/0.log" Dec 16 16:03:52 crc kubenswrapper[4775]: I1216 16:03:52.391608 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_af50c3ce-5c89-46eb-bd8c-83346b17ad3d/nova-cell0-conductor-conductor/0.log" Dec 16 16:03:52 crc kubenswrapper[4775]: I1216 16:03:52.691372 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_b19dccbe-2434-48ae-8822-1ced3b7167c7/nova-cell1-conductor-conductor/0.log" Dec 16 16:03:52 crc kubenswrapper[4775]: I1216 16:03:52.811711 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_22a00983-b0df-4afb-bbc2-2f7da7c8c05e/nova-api-log/0.log" Dec 16 16:03:53 crc kubenswrapper[4775]: I1216 16:03:53.020240 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2451285e-33a6-42ca-b8f9-336131211c7b/nova-cell1-novncproxy-novncproxy/0.log" Dec 16 16:03:53 crc kubenswrapper[4775]: I1216 16:03:53.090843 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-nnz9f_e3ac9c58-f9b2-4b76-baec-dc50c94c8185/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:03:53 crc kubenswrapper[4775]: I1216 16:03:53.127470 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_22a00983-b0df-4afb-bbc2-2f7da7c8c05e/nova-api-api/0.log" Dec 16 16:03:53 crc kubenswrapper[4775]: I1216 16:03:53.276780 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9e94b8ac-6213-42f6-94ff-7e42e358fcf9/nova-metadata-log/0.log" Dec 16 16:03:53 crc kubenswrapper[4775]: I1216 16:03:53.641952 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_f991d67b-2c42-4f93-aacb-3486ea1e43a8/nova-scheduler-scheduler/0.log" Dec 16 16:03:54 crc kubenswrapper[4775]: I1216 16:03:54.087389 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e109503b-1619-4659-956c-24c58c0011a6/mysql-bootstrap/0.log" Dec 16 16:03:54 crc kubenswrapper[4775]: I1216 16:03:54.167645 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e109503b-1619-4659-956c-24c58c0011a6/mysql-bootstrap/0.log" Dec 16 16:03:54 crc kubenswrapper[4775]: I1216 16:03:54.356789 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e109503b-1619-4659-956c-24c58c0011a6/galera/0.log" Dec 16 16:03:54 crc kubenswrapper[4775]: I1216 16:03:54.430056 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2baed48f-c5f4-4126-b0ed-403a38b18c00/mysql-bootstrap/0.log" Dec 16 16:03:54 crc kubenswrapper[4775]: I1216 16:03:54.616297 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2baed48f-c5f4-4126-b0ed-403a38b18c00/mysql-bootstrap/0.log" Dec 16 16:03:54 crc kubenswrapper[4775]: I1216 16:03:54.627125 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2baed48f-c5f4-4126-b0ed-403a38b18c00/galera/0.log" Dec 16 16:03:54 crc kubenswrapper[4775]: I1216 16:03:54.635589 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9e94b8ac-6213-42f6-94ff-7e42e358fcf9/nova-metadata-metadata/0.log" Dec 16 16:03:54 crc kubenswrapper[4775]: I1216 16:03:54.844268 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a9f81b8a-3b7e-4984-946f-2de17873b97a/openstackclient/0.log" Dec 16 16:03:54 crc kubenswrapper[4775]: I1216 16:03:54.881476 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-nwdtm_096c5279-0aa8-4641-8b5f-66e41869ec98/openstack-network-exporter/0.log" Dec 16 16:03:55 crc kubenswrapper[4775]: I1216 16:03:55.048473 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-c5f9m_a42d9c48-0f56-4f2d-8c54-8baebeca09ea/ovsdb-server-init/0.log" Dec 16 16:03:55 crc kubenswrapper[4775]: I1216 16:03:55.213749 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-c5f9m_a42d9c48-0f56-4f2d-8c54-8baebeca09ea/ovsdb-server/0.log" Dec 16 16:03:55 crc kubenswrapper[4775]: I1216 16:03:55.226718 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-c5f9m_a42d9c48-0f56-4f2d-8c54-8baebeca09ea/ovsdb-server-init/0.log" Dec 16 16:03:55 crc kubenswrapper[4775]: I1216 16:03:55.281028 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-c5f9m_a42d9c48-0f56-4f2d-8c54-8baebeca09ea/ovs-vswitchd/0.log" Dec 16 16:03:55 crc kubenswrapper[4775]: I1216 16:03:55.472674 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-rkmmt_b560f177-aa8d-4722-92bd-4ef2755caab0/ovn-controller/0.log" Dec 16 16:03:55 crc kubenswrapper[4775]: I1216 16:03:55.557841 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fdz4b_0ac16fe7-1c6b-49c8-a9d2-97db6fa4dc36/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:03:55 crc kubenswrapper[4775]: I1216 16:03:55.672095 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_470f973b-96da-437e-a5ce-e53dbadd9276/openstack-network-exporter/0.log" Dec 16 16:03:55 crc kubenswrapper[4775]: I1216 16:03:55.754976 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_470f973b-96da-437e-a5ce-e53dbadd9276/ovn-northd/0.log" Dec 16 16:03:55 crc kubenswrapper[4775]: I1216 16:03:55.862183 4775 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8b7b212f-2aa6-4fc0-a864-6cd8f1943b71/openstack-network-exporter/0.log" Dec 16 16:03:55 crc kubenswrapper[4775]: I1216 16:03:55.956445 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8b7b212f-2aa6-4fc0-a864-6cd8f1943b71/ovsdbserver-nb/0.log" Dec 16 16:03:55 crc kubenswrapper[4775]: I1216 16:03:55.997308 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e5650e0a-bb07-4cce-872c-772038c2ae56/openstack-network-exporter/0.log" Dec 16 16:03:56 crc kubenswrapper[4775]: I1216 16:03:56.106309 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e5650e0a-bb07-4cce-872c-772038c2ae56/ovsdbserver-sb/0.log" Dec 16 16:03:56 crc kubenswrapper[4775]: I1216 16:03:56.277058 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-c66959c54-2xm6x_c1a21b6b-9081-4060-8bc8-566c2a60bde6/placement-api/0.log" Dec 16 16:03:56 crc kubenswrapper[4775]: I1216 16:03:56.354046 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-c66959c54-2xm6x_c1a21b6b-9081-4060-8bc8-566c2a60bde6/placement-log/0.log" Dec 16 16:03:56 crc kubenswrapper[4775]: I1216 16:03:56.460945 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cf004dca-5d2e-4e4d-9c29-66b076fcc406/setup-container/0.log" Dec 16 16:03:56 crc kubenswrapper[4775]: I1216 16:03:56.677859 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cf004dca-5d2e-4e4d-9c29-66b076fcc406/rabbitmq/0.log" Dec 16 16:03:56 crc kubenswrapper[4775]: I1216 16:03:56.709577 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ba99f865-7192-4da9-8575-62d54a66d82e/setup-container/0.log" Dec 16 16:03:56 crc kubenswrapper[4775]: I1216 16:03:56.747130 4775 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cf004dca-5d2e-4e4d-9c29-66b076fcc406/setup-container/0.log" Dec 16 16:03:56 crc kubenswrapper[4775]: I1216 16:03:56.892073 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ba99f865-7192-4da9-8575-62d54a66d82e/setup-container/0.log" Dec 16 16:03:56 crc kubenswrapper[4775]: I1216 16:03:56.986379 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ba99f865-7192-4da9-8575-62d54a66d82e/rabbitmq/0.log" Dec 16 16:03:57 crc kubenswrapper[4775]: I1216 16:03:57.024536 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-jqb6f_dab0db60-d31f-4e9d-b17e-5dea1fdc90cb/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:03:57 crc kubenswrapper[4775]: I1216 16:03:57.172757 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-r5l7p_586695ef-512d-4d00-b127-751849932aef/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:03:57 crc kubenswrapper[4775]: I1216 16:03:57.238409 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-mhtj5_023c8812-4f2e-4b64-85c7-eabd4ed3d7f9/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:03:57 crc kubenswrapper[4775]: I1216 16:03:57.382815 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-tnb9k_e0ba352c-17c3-4c36-b409-83485c265668/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:03:57 crc kubenswrapper[4775]: I1216 16:03:57.469377 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-jf2hn_67d72872-cd76-413f-bcbe-e0c6da3a8f5a/ssh-known-hosts-edpm-deployment/0.log" Dec 16 16:03:57 crc kubenswrapper[4775]: I1216 16:03:57.721797 4775 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-proxy-78c6fdf4b7-xxfgx_15f3da25-9cb6-406e-b022-935c6201ea4a/proxy-server/0.log" Dec 16 16:03:57 crc kubenswrapper[4775]: I1216 16:03:57.855018 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-tp2tw_edd66213-7818-408d-a6ec-73c6e3b39321/swift-ring-rebalance/0.log" Dec 16 16:03:57 crc kubenswrapper[4775]: I1216 16:03:57.857027 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-78c6fdf4b7-xxfgx_15f3da25-9cb6-406e-b022-935c6201ea4a/proxy-httpd/0.log" Dec 16 16:03:58 crc kubenswrapper[4775]: I1216 16:03:58.014252 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/account-auditor/0.log" Dec 16 16:03:58 crc kubenswrapper[4775]: I1216 16:03:58.041408 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/account-reaper/0.log" Dec 16 16:03:58 crc kubenswrapper[4775]: I1216 16:03:58.110573 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/account-replicator/0.log" Dec 16 16:03:58 crc kubenswrapper[4775]: I1216 16:03:58.201496 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/account-server/0.log" Dec 16 16:03:58 crc kubenswrapper[4775]: I1216 16:03:58.234720 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/container-auditor/0.log" Dec 16 16:03:58 crc kubenswrapper[4775]: I1216 16:03:58.279736 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/container-replicator/0.log" Dec 16 16:03:58 crc kubenswrapper[4775]: I1216 16:03:58.326723 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/container-server/0.log" Dec 16 16:03:58 crc kubenswrapper[4775]: I1216 16:03:58.402536 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/container-updater/0.log" Dec 16 16:03:58 crc kubenswrapper[4775]: I1216 16:03:58.456799 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/object-auditor/0.log" Dec 16 16:03:58 crc kubenswrapper[4775]: I1216 16:03:58.545153 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/object-expirer/0.log" Dec 16 16:03:58 crc kubenswrapper[4775]: I1216 16:03:58.586329 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/object-replicator/0.log" Dec 16 16:03:58 crc kubenswrapper[4775]: I1216 16:03:58.642933 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/object-server/0.log" Dec 16 16:03:58 crc kubenswrapper[4775]: I1216 16:03:58.712994 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/object-updater/0.log" Dec 16 16:03:58 crc kubenswrapper[4775]: I1216 16:03:58.768511 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/rsync/0.log" Dec 16 16:03:58 crc kubenswrapper[4775]: I1216 16:03:58.801940 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8b23fde4-e483-4825-969c-94ebc8396511/swift-recon-cron/0.log" Dec 16 16:03:59 crc kubenswrapper[4775]: I1216 16:03:59.003398 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-9n929_533cc620-42ce-4262-bcfe-25c8ebe74ff6/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:03:59 crc kubenswrapper[4775]: I1216 16:03:59.096121 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_81e92dde-6675-4a19-a619-52358e91c49c/tempest-tests-tempest-tests-runner/0.log" Dec 16 16:03:59 crc kubenswrapper[4775]: I1216 16:03:59.202247 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_974b82d4-0fe0-449c-89d3-619ac869f974/test-operator-logs-container/0.log" Dec 16 16:03:59 crc kubenswrapper[4775]: I1216 16:03:59.330349 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-lfpb4_55b74b45-197a-47f8-88cf-ce675418f3ca/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:04:08 crc kubenswrapper[4775]: I1216 16:04:08.349720 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f8bdb272-4c39-4532-926a-f3dcc70af374/memcached/0.log" Dec 16 16:04:26 crc kubenswrapper[4775]: I1216 16:04:26.890905 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-95949466-9m4t8_03a9286d-3fd3-4ec6-9a1d-fb8d613f401e/manager/0.log" Dec 16 16:04:26 crc kubenswrapper[4775]: I1216 16:04:26.999636 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt_16deb1c1-d3c3-46d3-b565-30ef1773f202/util/0.log" Dec 16 16:04:27 crc kubenswrapper[4775]: I1216 16:04:27.200920 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt_16deb1c1-d3c3-46d3-b565-30ef1773f202/util/0.log" Dec 16 16:04:27 crc kubenswrapper[4775]: I1216 
16:04:27.203300 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt_16deb1c1-d3c3-46d3-b565-30ef1773f202/pull/0.log" Dec 16 16:04:27 crc kubenswrapper[4775]: I1216 16:04:27.206791 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt_16deb1c1-d3c3-46d3-b565-30ef1773f202/pull/0.log" Dec 16 16:04:27 crc kubenswrapper[4775]: I1216 16:04:27.380823 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt_16deb1c1-d3c3-46d3-b565-30ef1773f202/util/0.log" Dec 16 16:04:27 crc kubenswrapper[4775]: I1216 16:04:27.392875 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt_16deb1c1-d3c3-46d3-b565-30ef1773f202/pull/0.log" Dec 16 16:04:27 crc kubenswrapper[4775]: I1216 16:04:27.411923 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cbfec834b92b78cadff7abcd0973214c865864cd7e11ae0c8c70c04d8b7xwvt_16deb1c1-d3c3-46d3-b565-30ef1773f202/extract/0.log" Dec 16 16:04:27 crc kubenswrapper[4775]: I1216 16:04:27.603375 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66f8b87655-dfdrn_d0dab2aa-577b-4a9d-bcce-0530cbb3e4b6/manager/0.log" Dec 16 16:04:27 crc kubenswrapper[4775]: I1216 16:04:27.623025 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5f98b4754f-5gxtk_e002ee65-47de-44d4-864e-531283c322f7/manager/0.log" Dec 16 16:04:27 crc kubenswrapper[4775]: I1216 16:04:27.861360 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-767f9d7567-dqpmk_8e002a19-f4ca-4186-940c-321834e88e5e/manager/0.log" Dec 16 16:04:27 crc kubenswrapper[4775]: I1216 16:04:27.949534 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5559d9665f-4hmbr_c5962fcc-3c3b-435a-b848-237af19ce258/manager/0.log" Dec 16 16:04:28 crc kubenswrapper[4775]: I1216 16:04:28.602536 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6ccf486b9-pk5fg_a738c781-0876-490f-bf95-d7d77a6f2aff/manager/0.log" Dec 16 16:04:28 crc kubenswrapper[4775]: I1216 16:04:28.806413 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-f458558d7-9b8tb_d8873d69-8f0e-4816-b39e-bf8506282196/manager/0.log" Dec 16 16:04:28 crc kubenswrapper[4775]: I1216 16:04:28.884211 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6558fdd56c-jc4nj_be824423-7753-4920-8aa7-93d2904280fb/manager/0.log" Dec 16 16:04:29 crc kubenswrapper[4775]: I1216 16:04:29.218346 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5c7cbf548f-lph76_eff249fe-7aa9-406b-a4f0-91d7891afc8b/manager/0.log" Dec 16 16:04:29 crc kubenswrapper[4775]: I1216 16:04:29.320719 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5fdd9786f7-gcfkm_f82f14a2-7460-4a06-978b-d22d9ad7d6bd/manager/0.log" Dec 16 16:04:29 crc kubenswrapper[4775]: I1216 16:04:29.406381 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f76f4954c-47s9s_4fbf17e0-d42f-463b-9f01-a39d842812ff/manager/0.log" Dec 16 16:04:30 crc kubenswrapper[4775]: I1216 16:04:30.107290 4775 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7cd87b778f-qmqgx_1723eb19-5ef2-43d0-a1f8-590e89eb5f87/manager/0.log" Dec 16 16:04:30 crc kubenswrapper[4775]: I1216 16:04:30.177666 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5fbbf8b6cc-nvb99_63c035e4-8ff2-49a4-94d9-57c65a71494b/manager/0.log" Dec 16 16:04:30 crc kubenswrapper[4775]: I1216 16:04:30.337906 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7b67c7f6c58tbhb_275767d8-4eed-4a90-8d43-348c607ee37e/manager/0.log" Dec 16 16:04:30 crc kubenswrapper[4775]: I1216 16:04:30.338455 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-68c649d9d-nj5th_11012716-6e3c-4b17-97c7-16e723ad1092/manager/0.log" Dec 16 16:04:30 crc kubenswrapper[4775]: I1216 16:04:30.683669 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-nxcc2_16d64d82-cfc5-461a-a39a-48fd77562a54/registry-server/0.log" Dec 16 16:04:30 crc kubenswrapper[4775]: I1216 16:04:30.876788 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6d499bd55-lnqxb_480fe07d-8bd7-4879-bb80-ceb5f0baf2cb/operator/0.log" Dec 16 16:04:30 crc kubenswrapper[4775]: I1216 16:04:30.911534 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bf6d4f946-d9rg9_b70a54b3-3bc0-45e4-add9-d47b81371266/manager/0.log" Dec 16 16:04:31 crc kubenswrapper[4775]: I1216 16:04:31.038459 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8665b56d78-7fmnw_19d1c138-c230-44b2-972c-c557693054f5/manager/0.log" Dec 16 16:04:31 crc kubenswrapper[4775]: I1216 16:04:31.153275 4775 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xd48q_d132ccba-b1e9-4f8c-8129-1087a1a672b9/operator/0.log" Dec 16 16:04:31 crc kubenswrapper[4775]: I1216 16:04:31.283741 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5c6df8f9-r8p6v_85cc53cf-83a7-4810-b0fc-7317f9327c09/manager/0.log" Dec 16 16:04:31 crc kubenswrapper[4775]: I1216 16:04:31.438669 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-97d456b9-d2kbz_2f9a8b75-2e17-43ce-be88-dbc6f7ec0cb1/manager/0.log" Dec 16 16:04:31 crc kubenswrapper[4775]: I1216 16:04:31.568279 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-756ccf86c7-4mncx_f05c78d5-d86c-42de-9eee-e8d09204a0b4/manager/0.log" Dec 16 16:04:31 crc kubenswrapper[4775]: I1216 16:04:31.684288 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7bc9b98d8-rvdbc_bd6aff58-984e-4106-acb0-c689f6e31832/manager/0.log" Dec 16 16:04:31 crc kubenswrapper[4775]: I1216 16:04:31.723538 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-55f78b7c4c-bl86c_14102b10-a3ba-4f16-9928-4f41426a435f/manager/0.log" Dec 16 16:04:52 crc kubenswrapper[4775]: I1216 16:04:52.372670 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2tkrk_a492f5f7-b613-4e56-8071-78f8c836e7c3/control-plane-machine-set-operator/0.log" Dec 16 16:04:52 crc kubenswrapper[4775]: I1216 16:04:52.457459 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5fjmz_e1a2834e-159c-47f0-81a8-87d37d89a22a/kube-rbac-proxy/0.log" Dec 16 16:04:52 crc kubenswrapper[4775]: I1216 
16:04:52.551279 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5fjmz_e1a2834e-159c-47f0-81a8-87d37d89a22a/machine-api-operator/0.log" Dec 16 16:05:05 crc kubenswrapper[4775]: I1216 16:05:05.228831 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-kh9z9_4fc14e4b-fa58-41f3-b5b4-f27d75e6a294/cert-manager-controller/0.log" Dec 16 16:05:05 crc kubenswrapper[4775]: I1216 16:05:05.466323 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-s2cqf_5aa53da3-90be-4e8d-874f-817fce504026/cert-manager-cainjector/0.log" Dec 16 16:05:05 crc kubenswrapper[4775]: I1216 16:05:05.507714 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-gxr9j_370c0803-3050-431b-82e2-d3d69f5d386f/cert-manager-webhook/0.log" Dec 16 16:05:17 crc kubenswrapper[4775]: I1216 16:05:17.263802 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zbbgs"] Dec 16 16:05:17 crc kubenswrapper[4775]: E1216 16:05:17.265027 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="071fb55f-063a-421f-9431-a806fcc00b6a" containerName="extract-content" Dec 16 16:05:17 crc kubenswrapper[4775]: I1216 16:05:17.265044 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="071fb55f-063a-421f-9431-a806fcc00b6a" containerName="extract-content" Dec 16 16:05:17 crc kubenswrapper[4775]: E1216 16:05:17.265068 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="071fb55f-063a-421f-9431-a806fcc00b6a" containerName="extract-utilities" Dec 16 16:05:17 crc kubenswrapper[4775]: I1216 16:05:17.265075 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="071fb55f-063a-421f-9431-a806fcc00b6a" containerName="extract-utilities" Dec 16 16:05:17 crc kubenswrapper[4775]: E1216 16:05:17.265099 4775 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="071fb55f-063a-421f-9431-a806fcc00b6a" containerName="registry-server" Dec 16 16:05:17 crc kubenswrapper[4775]: I1216 16:05:17.265107 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="071fb55f-063a-421f-9431-a806fcc00b6a" containerName="registry-server" Dec 16 16:05:17 crc kubenswrapper[4775]: I1216 16:05:17.265286 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="071fb55f-063a-421f-9431-a806fcc00b6a" containerName="registry-server" Dec 16 16:05:17 crc kubenswrapper[4775]: I1216 16:05:17.267216 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zbbgs" Dec 16 16:05:17 crc kubenswrapper[4775]: I1216 16:05:17.276855 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zbbgs"] Dec 16 16:05:17 crc kubenswrapper[4775]: I1216 16:05:17.428147 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d642ead4-2aa2-4ac2-a69f-010962fbf570-catalog-content\") pod \"community-operators-zbbgs\" (UID: \"d642ead4-2aa2-4ac2-a69f-010962fbf570\") " pod="openshift-marketplace/community-operators-zbbgs" Dec 16 16:05:17 crc kubenswrapper[4775]: I1216 16:05:17.428218 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d642ead4-2aa2-4ac2-a69f-010962fbf570-utilities\") pod \"community-operators-zbbgs\" (UID: \"d642ead4-2aa2-4ac2-a69f-010962fbf570\") " pod="openshift-marketplace/community-operators-zbbgs" Dec 16 16:05:17 crc kubenswrapper[4775]: I1216 16:05:17.428305 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg85k\" (UniqueName: \"kubernetes.io/projected/d642ead4-2aa2-4ac2-a69f-010962fbf570-kube-api-access-hg85k\") pod 
\"community-operators-zbbgs\" (UID: \"d642ead4-2aa2-4ac2-a69f-010962fbf570\") " pod="openshift-marketplace/community-operators-zbbgs" Dec 16 16:05:17 crc kubenswrapper[4775]: I1216 16:05:17.530032 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d642ead4-2aa2-4ac2-a69f-010962fbf570-catalog-content\") pod \"community-operators-zbbgs\" (UID: \"d642ead4-2aa2-4ac2-a69f-010962fbf570\") " pod="openshift-marketplace/community-operators-zbbgs" Dec 16 16:05:17 crc kubenswrapper[4775]: I1216 16:05:17.530091 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d642ead4-2aa2-4ac2-a69f-010962fbf570-utilities\") pod \"community-operators-zbbgs\" (UID: \"d642ead4-2aa2-4ac2-a69f-010962fbf570\") " pod="openshift-marketplace/community-operators-zbbgs" Dec 16 16:05:17 crc kubenswrapper[4775]: I1216 16:05:17.530150 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg85k\" (UniqueName: \"kubernetes.io/projected/d642ead4-2aa2-4ac2-a69f-010962fbf570-kube-api-access-hg85k\") pod \"community-operators-zbbgs\" (UID: \"d642ead4-2aa2-4ac2-a69f-010962fbf570\") " pod="openshift-marketplace/community-operators-zbbgs" Dec 16 16:05:17 crc kubenswrapper[4775]: I1216 16:05:17.530534 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d642ead4-2aa2-4ac2-a69f-010962fbf570-catalog-content\") pod \"community-operators-zbbgs\" (UID: \"d642ead4-2aa2-4ac2-a69f-010962fbf570\") " pod="openshift-marketplace/community-operators-zbbgs" Dec 16 16:05:17 crc kubenswrapper[4775]: I1216 16:05:17.530651 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d642ead4-2aa2-4ac2-a69f-010962fbf570-utilities\") pod \"community-operators-zbbgs\" (UID: 
\"d642ead4-2aa2-4ac2-a69f-010962fbf570\") " pod="openshift-marketplace/community-operators-zbbgs" Dec 16 16:05:17 crc kubenswrapper[4775]: I1216 16:05:17.559937 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg85k\" (UniqueName: \"kubernetes.io/projected/d642ead4-2aa2-4ac2-a69f-010962fbf570-kube-api-access-hg85k\") pod \"community-operators-zbbgs\" (UID: \"d642ead4-2aa2-4ac2-a69f-010962fbf570\") " pod="openshift-marketplace/community-operators-zbbgs" Dec 16 16:05:17 crc kubenswrapper[4775]: I1216 16:05:17.588727 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zbbgs" Dec 16 16:05:18 crc kubenswrapper[4775]: I1216 16:05:18.412991 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zbbgs"] Dec 16 16:05:19 crc kubenswrapper[4775]: I1216 16:05:19.154095 4775 generic.go:334] "Generic (PLEG): container finished" podID="d642ead4-2aa2-4ac2-a69f-010962fbf570" containerID="bf420984ae1fcb790e6fba38c89dd4b305f5061be063c09a463f2d6b8158e3f8" exitCode=0 Dec 16 16:05:19 crc kubenswrapper[4775]: I1216 16:05:19.154337 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbbgs" event={"ID":"d642ead4-2aa2-4ac2-a69f-010962fbf570","Type":"ContainerDied","Data":"bf420984ae1fcb790e6fba38c89dd4b305f5061be063c09a463f2d6b8158e3f8"} Dec 16 16:05:19 crc kubenswrapper[4775]: I1216 16:05:19.154643 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbbgs" event={"ID":"d642ead4-2aa2-4ac2-a69f-010962fbf570","Type":"ContainerStarted","Data":"5d0532c7f1add4b2525e6ffbeb5e7b29986d3e6a35893ff920dcad038aec5908"} Dec 16 16:05:19 crc kubenswrapper[4775]: I1216 16:05:19.654784 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d2h6x"] Dec 16 16:05:19 crc kubenswrapper[4775]: I1216 16:05:19.659042 
4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d2h6x" Dec 16 16:05:19 crc kubenswrapper[4775]: I1216 16:05:19.666951 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d2h6x"] Dec 16 16:05:19 crc kubenswrapper[4775]: I1216 16:05:19.698076 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52f5a98a-8542-4964-8f43-1916b24c9b16-utilities\") pod \"certified-operators-d2h6x\" (UID: \"52f5a98a-8542-4964-8f43-1916b24c9b16\") " pod="openshift-marketplace/certified-operators-d2h6x" Dec 16 16:05:19 crc kubenswrapper[4775]: I1216 16:05:19.698516 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdhzb\" (UniqueName: \"kubernetes.io/projected/52f5a98a-8542-4964-8f43-1916b24c9b16-kube-api-access-qdhzb\") pod \"certified-operators-d2h6x\" (UID: \"52f5a98a-8542-4964-8f43-1916b24c9b16\") " pod="openshift-marketplace/certified-operators-d2h6x" Dec 16 16:05:19 crc kubenswrapper[4775]: I1216 16:05:19.698546 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52f5a98a-8542-4964-8f43-1916b24c9b16-catalog-content\") pod \"certified-operators-d2h6x\" (UID: \"52f5a98a-8542-4964-8f43-1916b24c9b16\") " pod="openshift-marketplace/certified-operators-d2h6x" Dec 16 16:05:19 crc kubenswrapper[4775]: I1216 16:05:19.799609 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdhzb\" (UniqueName: \"kubernetes.io/projected/52f5a98a-8542-4964-8f43-1916b24c9b16-kube-api-access-qdhzb\") pod \"certified-operators-d2h6x\" (UID: \"52f5a98a-8542-4964-8f43-1916b24c9b16\") " pod="openshift-marketplace/certified-operators-d2h6x" Dec 16 16:05:19 crc 
kubenswrapper[4775]: I1216 16:05:19.799656 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52f5a98a-8542-4964-8f43-1916b24c9b16-catalog-content\") pod \"certified-operators-d2h6x\" (UID: \"52f5a98a-8542-4964-8f43-1916b24c9b16\") " pod="openshift-marketplace/certified-operators-d2h6x" Dec 16 16:05:19 crc kubenswrapper[4775]: I1216 16:05:19.799741 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52f5a98a-8542-4964-8f43-1916b24c9b16-utilities\") pod \"certified-operators-d2h6x\" (UID: \"52f5a98a-8542-4964-8f43-1916b24c9b16\") " pod="openshift-marketplace/certified-operators-d2h6x" Dec 16 16:05:19 crc kubenswrapper[4775]: I1216 16:05:19.800241 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52f5a98a-8542-4964-8f43-1916b24c9b16-utilities\") pod \"certified-operators-d2h6x\" (UID: \"52f5a98a-8542-4964-8f43-1916b24c9b16\") " pod="openshift-marketplace/certified-operators-d2h6x" Dec 16 16:05:19 crc kubenswrapper[4775]: I1216 16:05:19.800315 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52f5a98a-8542-4964-8f43-1916b24c9b16-catalog-content\") pod \"certified-operators-d2h6x\" (UID: \"52f5a98a-8542-4964-8f43-1916b24c9b16\") " pod="openshift-marketplace/certified-operators-d2h6x" Dec 16 16:05:19 crc kubenswrapper[4775]: I1216 16:05:19.832653 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdhzb\" (UniqueName: \"kubernetes.io/projected/52f5a98a-8542-4964-8f43-1916b24c9b16-kube-api-access-qdhzb\") pod \"certified-operators-d2h6x\" (UID: \"52f5a98a-8542-4964-8f43-1916b24c9b16\") " pod="openshift-marketplace/certified-operators-d2h6x" Dec 16 16:05:19 crc kubenswrapper[4775]: I1216 16:05:19.978715 4775 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d2h6x" Dec 16 16:05:20 crc kubenswrapper[4775]: I1216 16:05:20.624750 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d2h6x"] Dec 16 16:05:21 crc kubenswrapper[4775]: I1216 16:05:21.179614 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2h6x" event={"ID":"52f5a98a-8542-4964-8f43-1916b24c9b16","Type":"ContainerStarted","Data":"6176b535ecc793f1e9478f213fa657bc9eb73e117f3f5040db8c369aead72bf0"} Dec 16 16:05:21 crc kubenswrapper[4775]: I1216 16:05:21.183844 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbbgs" event={"ID":"d642ead4-2aa2-4ac2-a69f-010962fbf570","Type":"ContainerStarted","Data":"c816865c22f7e98c1caa7c1dd0cf63b34fdee4ab3bbe1c8ea112b2fcd8a6179f"} Dec 16 16:05:21 crc kubenswrapper[4775]: I1216 16:05:21.693191 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-6842d_a471fecb-d3ef-427f-a02c-30a00b513bae/nmstate-console-plugin/0.log" Dec 16 16:05:21 crc kubenswrapper[4775]: I1216 16:05:21.875962 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-sc4rw_5c66735d-0eb0-46a8-b2db-f65158873132/nmstate-handler/0.log" Dec 16 16:05:21 crc kubenswrapper[4775]: I1216 16:05:21.994099 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-7hgfg_6384fd2d-45e1-421e-920f-5555dc0f8a10/kube-rbac-proxy/0.log" Dec 16 16:05:22 crc kubenswrapper[4775]: I1216 16:05:22.053356 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-7hgfg_6384fd2d-45e1-421e-920f-5555dc0f8a10/nmstate-metrics/0.log" Dec 16 16:05:22 crc kubenswrapper[4775]: I1216 16:05:22.198322 4775 generic.go:334] "Generic 
(PLEG): container finished" podID="52f5a98a-8542-4964-8f43-1916b24c9b16" containerID="a7f4be1fe27e6c2c50e7cdfc86965dc9f770cc9f292f85079955e9b1a7c9713c" exitCode=0 Dec 16 16:05:22 crc kubenswrapper[4775]: I1216 16:05:22.198386 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2h6x" event={"ID":"52f5a98a-8542-4964-8f43-1916b24c9b16","Type":"ContainerDied","Data":"a7f4be1fe27e6c2c50e7cdfc86965dc9f770cc9f292f85079955e9b1a7c9713c"} Dec 16 16:05:22 crc kubenswrapper[4775]: I1216 16:05:22.201563 4775 generic.go:334] "Generic (PLEG): container finished" podID="d642ead4-2aa2-4ac2-a69f-010962fbf570" containerID="c816865c22f7e98c1caa7c1dd0cf63b34fdee4ab3bbe1c8ea112b2fcd8a6179f" exitCode=0 Dec 16 16:05:22 crc kubenswrapper[4775]: I1216 16:05:22.201594 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbbgs" event={"ID":"d642ead4-2aa2-4ac2-a69f-010962fbf570","Type":"ContainerDied","Data":"c816865c22f7e98c1caa7c1dd0cf63b34fdee4ab3bbe1c8ea112b2fcd8a6179f"} Dec 16 16:05:22 crc kubenswrapper[4775]: I1216 16:05:22.373537 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-kkdbg_0318b125-3608-48d1-b19f-8fcad1785fa8/nmstate-operator/0.log" Dec 16 16:05:22 crc kubenswrapper[4775]: I1216 16:05:22.434437 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-4p9vw_52cbae70-fde7-47d8-a118-799f6fb64f2b/nmstate-webhook/0.log" Dec 16 16:05:23 crc kubenswrapper[4775]: I1216 16:05:23.213097 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbbgs" event={"ID":"d642ead4-2aa2-4ac2-a69f-010962fbf570","Type":"ContainerStarted","Data":"49dac046a0d04ee3eaa173cf676f04bf57d181ba34d98d48dc6892f741976916"} Dec 16 16:05:23 crc kubenswrapper[4775]: I1216 16:05:23.240318 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-zbbgs" podStartSLOduration=2.425137731 podStartE2EDuration="6.240292039s" podCreationTimestamp="2025-12-16 16:05:17 +0000 UTC" firstStartedPulling="2025-12-16 16:05:19.156361573 +0000 UTC m=+4244.107440496" lastFinishedPulling="2025-12-16 16:05:22.971515881 +0000 UTC m=+4247.922594804" observedRunningTime="2025-12-16 16:05:23.228760339 +0000 UTC m=+4248.179839272" watchObservedRunningTime="2025-12-16 16:05:23.240292039 +0000 UTC m=+4248.191370972" Dec 16 16:05:27 crc kubenswrapper[4775]: I1216 16:05:27.589253 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zbbgs" Dec 16 16:05:27 crc kubenswrapper[4775]: I1216 16:05:27.589731 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zbbgs" Dec 16 16:05:28 crc kubenswrapper[4775]: I1216 16:05:28.644624 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-zbbgs" podUID="d642ead4-2aa2-4ac2-a69f-010962fbf570" containerName="registry-server" probeResult="failure" output=< Dec 16 16:05:28 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Dec 16 16:05:28 crc kubenswrapper[4775]: > Dec 16 16:05:29 crc kubenswrapper[4775]: I1216 16:05:29.269232 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2h6x" event={"ID":"52f5a98a-8542-4964-8f43-1916b24c9b16","Type":"ContainerStarted","Data":"fd5258949e63bab1cbd37722e5eae2b9411709cc66624e3abd466f767a057461"} Dec 16 16:05:30 crc kubenswrapper[4775]: I1216 16:05:30.279781 4775 generic.go:334] "Generic (PLEG): container finished" podID="52f5a98a-8542-4964-8f43-1916b24c9b16" containerID="fd5258949e63bab1cbd37722e5eae2b9411709cc66624e3abd466f767a057461" exitCode=0 Dec 16 16:05:30 crc kubenswrapper[4775]: I1216 16:05:30.279835 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-d2h6x" event={"ID":"52f5a98a-8542-4964-8f43-1916b24c9b16","Type":"ContainerDied","Data":"fd5258949e63bab1cbd37722e5eae2b9411709cc66624e3abd466f767a057461"} Dec 16 16:05:31 crc kubenswrapper[4775]: I1216 16:05:31.310703 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2h6x" event={"ID":"52f5a98a-8542-4964-8f43-1916b24c9b16","Type":"ContainerStarted","Data":"2398f449f5a79e25d8967c67de70bd2d719283346638d9904662b65111c556b4"} Dec 16 16:05:31 crc kubenswrapper[4775]: I1216 16:05:31.329205 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d2h6x" podStartSLOduration=3.541149196 podStartE2EDuration="12.329191047s" podCreationTimestamp="2025-12-16 16:05:19 +0000 UTC" firstStartedPulling="2025-12-16 16:05:22.200408531 +0000 UTC m=+4247.151487454" lastFinishedPulling="2025-12-16 16:05:30.988450372 +0000 UTC m=+4255.939529305" observedRunningTime="2025-12-16 16:05:31.325855742 +0000 UTC m=+4256.276934675" watchObservedRunningTime="2025-12-16 16:05:31.329191047 +0000 UTC m=+4256.280269970" Dec 16 16:05:37 crc kubenswrapper[4775]: I1216 16:05:37.648509 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zbbgs" Dec 16 16:05:37 crc kubenswrapper[4775]: I1216 16:05:37.703953 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zbbgs" Dec 16 16:05:37 crc kubenswrapper[4775]: I1216 16:05:37.889007 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zbbgs"] Dec 16 16:05:39 crc kubenswrapper[4775]: I1216 16:05:39.368262 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-khxxp_aa78cbdc-f63e-4010-9bdc-88715f997591/kube-rbac-proxy/0.log" Dec 16 16:05:39 crc kubenswrapper[4775]: I1216 
16:05:39.388389 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zbbgs" podUID="d642ead4-2aa2-4ac2-a69f-010962fbf570" containerName="registry-server" containerID="cri-o://49dac046a0d04ee3eaa173cf676f04bf57d181ba34d98d48dc6892f741976916" gracePeriod=2 Dec 16 16:05:39 crc kubenswrapper[4775]: I1216 16:05:39.480499 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-khxxp_aa78cbdc-f63e-4010-9bdc-88715f997591/controller/0.log" Dec 16 16:05:39 crc kubenswrapper[4775]: I1216 16:05:39.979273 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d2h6x" Dec 16 16:05:39 crc kubenswrapper[4775]: I1216 16:05:39.979333 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d2h6x" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.004635 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-frr-files/0.log" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.038565 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d2h6x" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.148555 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-reloader/0.log" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.152378 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-frr-files/0.log" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.236606 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-metrics/0.log" Dec 16 16:05:40 crc 
kubenswrapper[4775]: I1216 16:05:40.267829 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-reloader/0.log" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.422080 4775 generic.go:334] "Generic (PLEG): container finished" podID="d642ead4-2aa2-4ac2-a69f-010962fbf570" containerID="49dac046a0d04ee3eaa173cf676f04bf57d181ba34d98d48dc6892f741976916" exitCode=0 Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.422388 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbbgs" event={"ID":"d642ead4-2aa2-4ac2-a69f-010962fbf570","Type":"ContainerDied","Data":"49dac046a0d04ee3eaa173cf676f04bf57d181ba34d98d48dc6892f741976916"} Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.438215 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-reloader/0.log" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.459569 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-metrics/0.log" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.490565 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d2h6x" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.506148 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-metrics/0.log" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.551507 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-frr-files/0.log" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.567000 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zbbgs" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.691860 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d642ead4-2aa2-4ac2-a69f-010962fbf570-catalog-content\") pod \"d642ead4-2aa2-4ac2-a69f-010962fbf570\" (UID: \"d642ead4-2aa2-4ac2-a69f-010962fbf570\") " Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.693263 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d642ead4-2aa2-4ac2-a69f-010962fbf570-utilities\") pod \"d642ead4-2aa2-4ac2-a69f-010962fbf570\" (UID: \"d642ead4-2aa2-4ac2-a69f-010962fbf570\") " Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.693588 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg85k\" (UniqueName: \"kubernetes.io/projected/d642ead4-2aa2-4ac2-a69f-010962fbf570-kube-api-access-hg85k\") pod \"d642ead4-2aa2-4ac2-a69f-010962fbf570\" (UID: \"d642ead4-2aa2-4ac2-a69f-010962fbf570\") " Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.694098 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d642ead4-2aa2-4ac2-a69f-010962fbf570-utilities" (OuterVolumeSpecName: "utilities") pod "d642ead4-2aa2-4ac2-a69f-010962fbf570" (UID: "d642ead4-2aa2-4ac2-a69f-010962fbf570"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.694508 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d642ead4-2aa2-4ac2-a69f-010962fbf570-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.700099 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d642ead4-2aa2-4ac2-a69f-010962fbf570-kube-api-access-hg85k" (OuterVolumeSpecName: "kube-api-access-hg85k") pod "d642ead4-2aa2-4ac2-a69f-010962fbf570" (UID: "d642ead4-2aa2-4ac2-a69f-010962fbf570"). InnerVolumeSpecName "kube-api-access-hg85k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.742219 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d642ead4-2aa2-4ac2-a69f-010962fbf570-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d642ead4-2aa2-4ac2-a69f-010962fbf570" (UID: "d642ead4-2aa2-4ac2-a69f-010962fbf570"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.774294 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/controller/0.log" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.785571 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-frr-files/0.log" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.785928 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-reloader/0.log" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.795572 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg85k\" (UniqueName: \"kubernetes.io/projected/d642ead4-2aa2-4ac2-a69f-010962fbf570-kube-api-access-hg85k\") on node \"crc\" DevicePath \"\"" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.795608 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d642ead4-2aa2-4ac2-a69f-010962fbf570-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.801406 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/cp-metrics/0.log" Dec 16 16:05:40 crc kubenswrapper[4775]: I1216 16:05:40.957772 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/frr-metrics/0.log" Dec 16 16:05:41 crc kubenswrapper[4775]: I1216 16:05:41.065562 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/kube-rbac-proxy-frr/0.log" Dec 16 16:05:41 crc kubenswrapper[4775]: I1216 16:05:41.170143 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/kube-rbac-proxy/0.log" Dec 16 16:05:41 crc kubenswrapper[4775]: I1216 16:05:41.442034 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbbgs" event={"ID":"d642ead4-2aa2-4ac2-a69f-010962fbf570","Type":"ContainerDied","Data":"5d0532c7f1add4b2525e6ffbeb5e7b29986d3e6a35893ff920dcad038aec5908"} Dec 16 16:05:41 crc kubenswrapper[4775]: I1216 16:05:41.442097 4775 scope.go:117] "RemoveContainer" containerID="49dac046a0d04ee3eaa173cf676f04bf57d181ba34d98d48dc6892f741976916" Dec 16 16:05:41 crc kubenswrapper[4775]: I1216 16:05:41.442277 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zbbgs" Dec 16 16:05:41 crc kubenswrapper[4775]: I1216 16:05:41.472421 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zbbgs"] Dec 16 16:05:41 crc kubenswrapper[4775]: I1216 16:05:41.473341 4775 scope.go:117] "RemoveContainer" containerID="c816865c22f7e98c1caa7c1dd0cf63b34fdee4ab3bbe1c8ea112b2fcd8a6179f" Dec 16 16:05:41 crc kubenswrapper[4775]: I1216 16:05:41.484543 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zbbgs"] Dec 16 16:05:41 crc kubenswrapper[4775]: I1216 16:05:41.524708 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d2h6x"] Dec 16 16:05:41 crc kubenswrapper[4775]: I1216 16:05:41.557061 4775 scope.go:117] "RemoveContainer" containerID="bf420984ae1fcb790e6fba38c89dd4b305f5061be063c09a463f2d6b8158e3f8" Dec 16 16:05:41 crc kubenswrapper[4775]: I1216 16:05:41.895410 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dzt2q"] Dec 16 16:05:41 crc kubenswrapper[4775]: I1216 16:05:41.895798 4775 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-dzt2q" podUID="2be58473-7d1b-4c58-a3a7-862cd4846f63" containerName="registry-server" containerID="cri-o://b59fbccccf16cf9ac1c30617552986c59d9ab3fd46ed3d6805f8a04393ac8ff5" gracePeriod=2 Dec 16 16:05:41 crc kubenswrapper[4775]: I1216 16:05:41.925937 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/reloader/0.log" Dec 16 16:05:41 crc kubenswrapper[4775]: I1216 16:05:41.953525 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-6hfpf_b55d23ff-e3e6-460c-8058-0489204c8a4d/frr-k8s-webhook-server/0.log" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.193869 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6dbcb5f69b-g6llk_4ca4d9bd-c3ac-4817-bb00-c5b25ec7b4cd/manager/0.log" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.399473 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dzt2q" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.460840 4775 generic.go:334] "Generic (PLEG): container finished" podID="2be58473-7d1b-4c58-a3a7-862cd4846f63" containerID="b59fbccccf16cf9ac1c30617552986c59d9ab3fd46ed3d6805f8a04393ac8ff5" exitCode=0 Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.460949 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzt2q" event={"ID":"2be58473-7d1b-4c58-a3a7-862cd4846f63","Type":"ContainerDied","Data":"b59fbccccf16cf9ac1c30617552986c59d9ab3fd46ed3d6805f8a04393ac8ff5"} Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.460977 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzt2q" event={"ID":"2be58473-7d1b-4c58-a3a7-862cd4846f63","Type":"ContainerDied","Data":"08784e40323c4dd5d7099fafe2e9813c1c907379d83841b889768ef487b9deaf"} Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.460994 4775 scope.go:117] "RemoveContainer" containerID="b59fbccccf16cf9ac1c30617552986c59d9ab3fd46ed3d6805f8a04393ac8ff5" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.461102 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dzt2q" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.500941 4775 scope.go:117] "RemoveContainer" containerID="2ebafe6b84e9567482ce792aaeb5efe10452c088ad12d404a028fc83576fee0c" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.509259 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7889b8b87-lgnbs_e79447d0-f855-4f85-a021-0618e819f822/webhook-server/0.log" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.524135 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2be58473-7d1b-4c58-a3a7-862cd4846f63-catalog-content\") pod \"2be58473-7d1b-4c58-a3a7-862cd4846f63\" (UID: \"2be58473-7d1b-4c58-a3a7-862cd4846f63\") " Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.524323 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9lzs\" (UniqueName: \"kubernetes.io/projected/2be58473-7d1b-4c58-a3a7-862cd4846f63-kube-api-access-q9lzs\") pod \"2be58473-7d1b-4c58-a3a7-862cd4846f63\" (UID: \"2be58473-7d1b-4c58-a3a7-862cd4846f63\") " Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.524529 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2be58473-7d1b-4c58-a3a7-862cd4846f63-utilities\") pod \"2be58473-7d1b-4c58-a3a7-862cd4846f63\" (UID: \"2be58473-7d1b-4c58-a3a7-862cd4846f63\") " Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.535261 4775 scope.go:117] "RemoveContainer" containerID="b92763adec0fcd3e8c491aea68f0b51226fa3b03c52458d52081832282ca5d3e" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.537406 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2be58473-7d1b-4c58-a3a7-862cd4846f63-utilities" (OuterVolumeSpecName: 
"utilities") pod "2be58473-7d1b-4c58-a3a7-862cd4846f63" (UID: "2be58473-7d1b-4c58-a3a7-862cd4846f63"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.542287 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2be58473-7d1b-4c58-a3a7-862cd4846f63-kube-api-access-q9lzs" (OuterVolumeSpecName: "kube-api-access-q9lzs") pod "2be58473-7d1b-4c58-a3a7-862cd4846f63" (UID: "2be58473-7d1b-4c58-a3a7-862cd4846f63"). InnerVolumeSpecName "kube-api-access-q9lzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.627323 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2be58473-7d1b-4c58-a3a7-862cd4846f63-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.627359 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9lzs\" (UniqueName: \"kubernetes.io/projected/2be58473-7d1b-4c58-a3a7-862cd4846f63-kube-api-access-q9lzs\") on node \"crc\" DevicePath \"\"" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.634223 4775 scope.go:117] "RemoveContainer" containerID="b59fbccccf16cf9ac1c30617552986c59d9ab3fd46ed3d6805f8a04393ac8ff5" Dec 16 16:05:42 crc kubenswrapper[4775]: E1216 16:05:42.634599 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b59fbccccf16cf9ac1c30617552986c59d9ab3fd46ed3d6805f8a04393ac8ff5\": container with ID starting with b59fbccccf16cf9ac1c30617552986c59d9ab3fd46ed3d6805f8a04393ac8ff5 not found: ID does not exist" containerID="b59fbccccf16cf9ac1c30617552986c59d9ab3fd46ed3d6805f8a04393ac8ff5" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.634640 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b59fbccccf16cf9ac1c30617552986c59d9ab3fd46ed3d6805f8a04393ac8ff5"} err="failed to get container status \"b59fbccccf16cf9ac1c30617552986c59d9ab3fd46ed3d6805f8a04393ac8ff5\": rpc error: code = NotFound desc = could not find container \"b59fbccccf16cf9ac1c30617552986c59d9ab3fd46ed3d6805f8a04393ac8ff5\": container with ID starting with b59fbccccf16cf9ac1c30617552986c59d9ab3fd46ed3d6805f8a04393ac8ff5 not found: ID does not exist" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.634667 4775 scope.go:117] "RemoveContainer" containerID="2ebafe6b84e9567482ce792aaeb5efe10452c088ad12d404a028fc83576fee0c" Dec 16 16:05:42 crc kubenswrapper[4775]: E1216 16:05:42.635005 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ebafe6b84e9567482ce792aaeb5efe10452c088ad12d404a028fc83576fee0c\": container with ID starting with 2ebafe6b84e9567482ce792aaeb5efe10452c088ad12d404a028fc83576fee0c not found: ID does not exist" containerID="2ebafe6b84e9567482ce792aaeb5efe10452c088ad12d404a028fc83576fee0c" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.635035 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ebafe6b84e9567482ce792aaeb5efe10452c088ad12d404a028fc83576fee0c"} err="failed to get container status \"2ebafe6b84e9567482ce792aaeb5efe10452c088ad12d404a028fc83576fee0c\": rpc error: code = NotFound desc = could not find container \"2ebafe6b84e9567482ce792aaeb5efe10452c088ad12d404a028fc83576fee0c\": container with ID starting with 2ebafe6b84e9567482ce792aaeb5efe10452c088ad12d404a028fc83576fee0c not found: ID does not exist" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.635072 4775 scope.go:117] "RemoveContainer" containerID="b92763adec0fcd3e8c491aea68f0b51226fa3b03c52458d52081832282ca5d3e" Dec 16 16:05:42 crc kubenswrapper[4775]: E1216 16:05:42.635485 4775 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b92763adec0fcd3e8c491aea68f0b51226fa3b03c52458d52081832282ca5d3e\": container with ID starting with b92763adec0fcd3e8c491aea68f0b51226fa3b03c52458d52081832282ca5d3e not found: ID does not exist" containerID="b92763adec0fcd3e8c491aea68f0b51226fa3b03c52458d52081832282ca5d3e" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.635530 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b92763adec0fcd3e8c491aea68f0b51226fa3b03c52458d52081832282ca5d3e"} err="failed to get container status \"b92763adec0fcd3e8c491aea68f0b51226fa3b03c52458d52081832282ca5d3e\": rpc error: code = NotFound desc = could not find container \"b92763adec0fcd3e8c491aea68f0b51226fa3b03c52458d52081832282ca5d3e\": container with ID starting with b92763adec0fcd3e8c491aea68f0b51226fa3b03c52458d52081832282ca5d3e not found: ID does not exist" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.635855 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2txt9_d80b883c-02c1-4d56-a369-addb8c7bfdca/frr/0.log" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.648865 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2be58473-7d1b-4c58-a3a7-862cd4846f63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2be58473-7d1b-4c58-a3a7-862cd4846f63" (UID: "2be58473-7d1b-4c58-a3a7-862cd4846f63"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.729617 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2be58473-7d1b-4c58-a3a7-862cd4846f63-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.807578 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dzt2q"] Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.814153 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dzt2q"] Dec 16 16:05:42 crc kubenswrapper[4775]: I1216 16:05:42.923293 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hzvbb_2516e125-5678-4a01-8a6b-1f8865b69f77/kube-rbac-proxy/0.log" Dec 16 16:05:43 crc kubenswrapper[4775]: I1216 16:05:43.178835 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hzvbb_2516e125-5678-4a01-8a6b-1f8865b69f77/speaker/0.log" Dec 16 16:05:43 crc kubenswrapper[4775]: I1216 16:05:43.357913 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2be58473-7d1b-4c58-a3a7-862cd4846f63" path="/var/lib/kubelet/pods/2be58473-7d1b-4c58-a3a7-862cd4846f63/volumes" Dec 16 16:05:43 crc kubenswrapper[4775]: I1216 16:05:43.359395 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d642ead4-2aa2-4ac2-a69f-010962fbf570" path="/var/lib/kubelet/pods/d642ead4-2aa2-4ac2-a69f-010962fbf570/volumes" Dec 16 16:05:56 crc kubenswrapper[4775]: I1216 16:05:56.744631 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66_50c7dffe-e977-448f-bcdd-7a68df1cefca/util/0.log" Dec 16 16:05:56 crc kubenswrapper[4775]: I1216 16:05:56.915281 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66_50c7dffe-e977-448f-bcdd-7a68df1cefca/util/0.log" Dec 16 16:05:56 crc kubenswrapper[4775]: I1216 16:05:56.940779 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66_50c7dffe-e977-448f-bcdd-7a68df1cefca/pull/0.log" Dec 16 16:05:57 crc kubenswrapper[4775]: I1216 16:05:57.048755 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66_50c7dffe-e977-448f-bcdd-7a68df1cefca/pull/0.log" Dec 16 16:05:57 crc kubenswrapper[4775]: I1216 16:05:57.249074 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66_50c7dffe-e977-448f-bcdd-7a68df1cefca/pull/0.log" Dec 16 16:05:57 crc kubenswrapper[4775]: I1216 16:05:57.265028 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66_50c7dffe-e977-448f-bcdd-7a68df1cefca/util/0.log" Dec 16 16:05:57 crc kubenswrapper[4775]: I1216 16:05:57.267490 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4lgd66_50c7dffe-e977-448f-bcdd-7a68df1cefca/extract/0.log" Dec 16 16:05:57 crc kubenswrapper[4775]: I1216 16:05:57.432841 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx_739f7090-9a46-4ae3-a85b-045a2b1e197d/util/0.log" Dec 16 16:05:57 crc kubenswrapper[4775]: I1216 16:05:57.590524 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx_739f7090-9a46-4ae3-a85b-045a2b1e197d/pull/0.log" Dec 16 
16:05:57 crc kubenswrapper[4775]: I1216 16:05:57.634683 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx_739f7090-9a46-4ae3-a85b-045a2b1e197d/pull/0.log" Dec 16 16:05:57 crc kubenswrapper[4775]: I1216 16:05:57.638108 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx_739f7090-9a46-4ae3-a85b-045a2b1e197d/util/0.log" Dec 16 16:05:57 crc kubenswrapper[4775]: I1216 16:05:57.841119 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx_739f7090-9a46-4ae3-a85b-045a2b1e197d/pull/0.log" Dec 16 16:05:57 crc kubenswrapper[4775]: I1216 16:05:57.867292 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx_739f7090-9a46-4ae3-a85b-045a2b1e197d/extract/0.log" Dec 16 16:05:57 crc kubenswrapper[4775]: I1216 16:05:57.895353 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8n6tzx_739f7090-9a46-4ae3-a85b-045a2b1e197d/util/0.log" Dec 16 16:05:58 crc kubenswrapper[4775]: I1216 16:05:58.047610 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d2h6x_52f5a98a-8542-4964-8f43-1916b24c9b16/extract-utilities/0.log" Dec 16 16:05:58 crc kubenswrapper[4775]: I1216 16:05:58.767627 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d2h6x_52f5a98a-8542-4964-8f43-1916b24c9b16/extract-content/0.log" Dec 16 16:05:58 crc kubenswrapper[4775]: I1216 16:05:58.820173 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-d2h6x_52f5a98a-8542-4964-8f43-1916b24c9b16/extract-utilities/0.log" Dec 16 16:05:58 crc kubenswrapper[4775]: I1216 16:05:58.847086 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d2h6x_52f5a98a-8542-4964-8f43-1916b24c9b16/extract-content/0.log" Dec 16 16:05:59 crc kubenswrapper[4775]: I1216 16:05:59.052089 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d2h6x_52f5a98a-8542-4964-8f43-1916b24c9b16/extract-utilities/0.log" Dec 16 16:05:59 crc kubenswrapper[4775]: I1216 16:05:59.108256 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d2h6x_52f5a98a-8542-4964-8f43-1916b24c9b16/registry-server/0.log" Dec 16 16:05:59 crc kubenswrapper[4775]: I1216 16:05:59.110539 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-d2h6x_52f5a98a-8542-4964-8f43-1916b24c9b16/extract-content/0.log" Dec 16 16:05:59 crc kubenswrapper[4775]: I1216 16:05:59.246473 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwsw2_37c67918-469b-4d46-aabb-63b96e941479/extract-utilities/0.log" Dec 16 16:05:59 crc kubenswrapper[4775]: I1216 16:05:59.428470 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwsw2_37c67918-469b-4d46-aabb-63b96e941479/extract-content/0.log" Dec 16 16:05:59 crc kubenswrapper[4775]: I1216 16:05:59.429054 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwsw2_37c67918-469b-4d46-aabb-63b96e941479/extract-utilities/0.log" Dec 16 16:05:59 crc kubenswrapper[4775]: I1216 16:05:59.455087 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-fwsw2_37c67918-469b-4d46-aabb-63b96e941479/extract-content/0.log" Dec 16 16:05:59 crc kubenswrapper[4775]: I1216 16:05:59.615661 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwsw2_37c67918-469b-4d46-aabb-63b96e941479/extract-content/0.log" Dec 16 16:05:59 crc kubenswrapper[4775]: I1216 16:05:59.635450 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwsw2_37c67918-469b-4d46-aabb-63b96e941479/extract-utilities/0.log" Dec 16 16:05:59 crc kubenswrapper[4775]: I1216 16:05:59.826071 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-76f84_7d608ef1-7f5b-45c5-80ce-f9be86cd93fe/marketplace-operator/0.log" Dec 16 16:05:59 crc kubenswrapper[4775]: I1216 16:05:59.934528 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-67zrm_59df28e1-27a5-451d-9784-a30eba2a3dc0/extract-utilities/0.log" Dec 16 16:06:00 crc kubenswrapper[4775]: I1216 16:06:00.430352 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fwsw2_37c67918-469b-4d46-aabb-63b96e941479/registry-server/0.log" Dec 16 16:06:00 crc kubenswrapper[4775]: I1216 16:06:00.542615 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-67zrm_59df28e1-27a5-451d-9784-a30eba2a3dc0/extract-content/0.log" Dec 16 16:06:00 crc kubenswrapper[4775]: I1216 16:06:00.560810 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-67zrm_59df28e1-27a5-451d-9784-a30eba2a3dc0/extract-utilities/0.log" Dec 16 16:06:00 crc kubenswrapper[4775]: I1216 16:06:00.566995 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-67zrm_59df28e1-27a5-451d-9784-a30eba2a3dc0/extract-content/0.log" Dec 16 16:06:00 crc kubenswrapper[4775]: I1216 16:06:00.708905 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-67zrm_59df28e1-27a5-451d-9784-a30eba2a3dc0/extract-content/0.log" Dec 16 16:06:00 crc kubenswrapper[4775]: I1216 16:06:00.709780 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-67zrm_59df28e1-27a5-451d-9784-a30eba2a3dc0/extract-utilities/0.log" Dec 16 16:06:00 crc kubenswrapper[4775]: I1216 16:06:00.820407 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s28hs_e182acf8-e0f8-4ad4-b91f-0028568a79c3/extract-utilities/0.log" Dec 16 16:06:00 crc kubenswrapper[4775]: I1216 16:06:00.948695 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-67zrm_59df28e1-27a5-451d-9784-a30eba2a3dc0/registry-server/0.log" Dec 16 16:06:00 crc kubenswrapper[4775]: I1216 16:06:00.999602 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s28hs_e182acf8-e0f8-4ad4-b91f-0028568a79c3/extract-content/0.log" Dec 16 16:06:01 crc kubenswrapper[4775]: I1216 16:06:01.030297 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s28hs_e182acf8-e0f8-4ad4-b91f-0028568a79c3/extract-utilities/0.log" Dec 16 16:06:01 crc kubenswrapper[4775]: I1216 16:06:01.044476 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s28hs_e182acf8-e0f8-4ad4-b91f-0028568a79c3/extract-content/0.log" Dec 16 16:06:01 crc kubenswrapper[4775]: I1216 16:06:01.171762 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s28hs_e182acf8-e0f8-4ad4-b91f-0028568a79c3/extract-content/0.log" Dec 
16 16:06:01 crc kubenswrapper[4775]: I1216 16:06:01.176487 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s28hs_e182acf8-e0f8-4ad4-b91f-0028568a79c3/extract-utilities/0.log" Dec 16 16:06:01 crc kubenswrapper[4775]: I1216 16:06:01.594703 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s28hs_e182acf8-e0f8-4ad4-b91f-0028568a79c3/registry-server/0.log" Dec 16 16:06:02 crc kubenswrapper[4775]: I1216 16:06:02.869095 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 16:06:02 crc kubenswrapper[4775]: I1216 16:06:02.869446 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 16:06:32 crc kubenswrapper[4775]: I1216 16:06:32.868855 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 16:06:32 crc kubenswrapper[4775]: I1216 16:06:32.869398 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 16:06:35 crc kubenswrapper[4775]: 
E1216 16:06:35.429879 4775 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.223:50524->38.102.83.223:37315: write tcp 38.102.83.223:50524->38.102.83.223:37315: write: broken pipe Dec 16 16:07:02 crc kubenswrapper[4775]: I1216 16:07:02.871711 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 16:07:02 crc kubenswrapper[4775]: I1216 16:07:02.872340 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 16:07:02 crc kubenswrapper[4775]: I1216 16:07:02.872404 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 16:07:02 crc kubenswrapper[4775]: I1216 16:07:02.874751 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"463c218c74a81c197a6e325e9e6e91831b9764405b5f63ce630e6f2e7837e133"} pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 16:07:02 crc kubenswrapper[4775]: I1216 16:07:02.875088 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" containerID="cri-o://463c218c74a81c197a6e325e9e6e91831b9764405b5f63ce630e6f2e7837e133" gracePeriod=600 
Dec 16 16:07:03 crc kubenswrapper[4775]: I1216 16:07:03.168298 4775 generic.go:334] "Generic (PLEG): container finished" podID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerID="463c218c74a81c197a6e325e9e6e91831b9764405b5f63ce630e6f2e7837e133" exitCode=0 Dec 16 16:07:03 crc kubenswrapper[4775]: I1216 16:07:03.168436 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerDied","Data":"463c218c74a81c197a6e325e9e6e91831b9764405b5f63ce630e6f2e7837e133"} Dec 16 16:07:03 crc kubenswrapper[4775]: I1216 16:07:03.168655 4775 scope.go:117] "RemoveContainer" containerID="a6bf506c996a332874f9510ce3ddfe96e932d9fe66997393464c774620c75faf" Dec 16 16:07:04 crc kubenswrapper[4775]: I1216 16:07:04.178459 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerStarted","Data":"d3e60c9c88a8eba7cfccff403f539430af46819167e101c59e7f400f53b3cfd7"} Dec 16 16:07:47 crc kubenswrapper[4775]: I1216 16:07:47.602486 4775 generic.go:334] "Generic (PLEG): container finished" podID="275854a3-0a76-4f6a-9a0a-12605fa0f5b0" containerID="689de27950b1438e5663571c1ee83fde62ec541677dfd8314fcc333f2b83ae53" exitCode=0 Dec 16 16:07:47 crc kubenswrapper[4775]: I1216 16:07:47.602618 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwqkz/must-gather-2sghs" event={"ID":"275854a3-0a76-4f6a-9a0a-12605fa0f5b0","Type":"ContainerDied","Data":"689de27950b1438e5663571c1ee83fde62ec541677dfd8314fcc333f2b83ae53"} Dec 16 16:07:47 crc kubenswrapper[4775]: I1216 16:07:47.604086 4775 scope.go:117] "RemoveContainer" containerID="689de27950b1438e5663571c1ee83fde62ec541677dfd8314fcc333f2b83ae53" Dec 16 16:07:47 crc kubenswrapper[4775]: I1216 16:07:47.682189 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-nwqkz_must-gather-2sghs_275854a3-0a76-4f6a-9a0a-12605fa0f5b0/gather/0.log" Dec 16 16:07:57 crc kubenswrapper[4775]: I1216 16:07:57.756147 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nwqkz/must-gather-2sghs"] Dec 16 16:07:57 crc kubenswrapper[4775]: I1216 16:07:57.756973 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-nwqkz/must-gather-2sghs" podUID="275854a3-0a76-4f6a-9a0a-12605fa0f5b0" containerName="copy" containerID="cri-o://21352973595c7377da49851b3f90ba5ad19981ae1c5fbbf6fa393a2e213be8fc" gracePeriod=2 Dec 16 16:07:57 crc kubenswrapper[4775]: I1216 16:07:57.762875 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nwqkz/must-gather-2sghs"] Dec 16 16:07:58 crc kubenswrapper[4775]: I1216 16:07:58.197789 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nwqkz_must-gather-2sghs_275854a3-0a76-4f6a-9a0a-12605fa0f5b0/copy/0.log" Dec 16 16:07:58 crc kubenswrapper[4775]: I1216 16:07:58.198547 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwqkz/must-gather-2sghs" Dec 16 16:07:58 crc kubenswrapper[4775]: I1216 16:07:58.294701 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/275854a3-0a76-4f6a-9a0a-12605fa0f5b0-must-gather-output\") pod \"275854a3-0a76-4f6a-9a0a-12605fa0f5b0\" (UID: \"275854a3-0a76-4f6a-9a0a-12605fa0f5b0\") " Dec 16 16:07:58 crc kubenswrapper[4775]: I1216 16:07:58.294969 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlcwd\" (UniqueName: \"kubernetes.io/projected/275854a3-0a76-4f6a-9a0a-12605fa0f5b0-kube-api-access-dlcwd\") pod \"275854a3-0a76-4f6a-9a0a-12605fa0f5b0\" (UID: \"275854a3-0a76-4f6a-9a0a-12605fa0f5b0\") " Dec 16 16:07:58 crc kubenswrapper[4775]: I1216 16:07:58.304495 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/275854a3-0a76-4f6a-9a0a-12605fa0f5b0-kube-api-access-dlcwd" (OuterVolumeSpecName: "kube-api-access-dlcwd") pod "275854a3-0a76-4f6a-9a0a-12605fa0f5b0" (UID: "275854a3-0a76-4f6a-9a0a-12605fa0f5b0"). InnerVolumeSpecName "kube-api-access-dlcwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:07:58 crc kubenswrapper[4775]: I1216 16:07:58.397525 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlcwd\" (UniqueName: \"kubernetes.io/projected/275854a3-0a76-4f6a-9a0a-12605fa0f5b0-kube-api-access-dlcwd\") on node \"crc\" DevicePath \"\"" Dec 16 16:07:58 crc kubenswrapper[4775]: I1216 16:07:58.434862 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/275854a3-0a76-4f6a-9a0a-12605fa0f5b0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "275854a3-0a76-4f6a-9a0a-12605fa0f5b0" (UID: "275854a3-0a76-4f6a-9a0a-12605fa0f5b0"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:07:58 crc kubenswrapper[4775]: I1216 16:07:58.500106 4775 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/275854a3-0a76-4f6a-9a0a-12605fa0f5b0-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 16 16:07:58 crc kubenswrapper[4775]: I1216 16:07:58.714554 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nwqkz_must-gather-2sghs_275854a3-0a76-4f6a-9a0a-12605fa0f5b0/copy/0.log" Dec 16 16:07:58 crc kubenswrapper[4775]: I1216 16:07:58.714966 4775 generic.go:334] "Generic (PLEG): container finished" podID="275854a3-0a76-4f6a-9a0a-12605fa0f5b0" containerID="21352973595c7377da49851b3f90ba5ad19981ae1c5fbbf6fa393a2e213be8fc" exitCode=143 Dec 16 16:07:58 crc kubenswrapper[4775]: I1216 16:07:58.715021 4775 scope.go:117] "RemoveContainer" containerID="21352973595c7377da49851b3f90ba5ad19981ae1c5fbbf6fa393a2e213be8fc" Dec 16 16:07:58 crc kubenswrapper[4775]: I1216 16:07:58.715029 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwqkz/must-gather-2sghs" Dec 16 16:07:58 crc kubenswrapper[4775]: I1216 16:07:58.763120 4775 scope.go:117] "RemoveContainer" containerID="689de27950b1438e5663571c1ee83fde62ec541677dfd8314fcc333f2b83ae53" Dec 16 16:07:58 crc kubenswrapper[4775]: I1216 16:07:58.842201 4775 scope.go:117] "RemoveContainer" containerID="21352973595c7377da49851b3f90ba5ad19981ae1c5fbbf6fa393a2e213be8fc" Dec 16 16:07:58 crc kubenswrapper[4775]: E1216 16:07:58.842969 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21352973595c7377da49851b3f90ba5ad19981ae1c5fbbf6fa393a2e213be8fc\": container with ID starting with 21352973595c7377da49851b3f90ba5ad19981ae1c5fbbf6fa393a2e213be8fc not found: ID does not exist" containerID="21352973595c7377da49851b3f90ba5ad19981ae1c5fbbf6fa393a2e213be8fc" Dec 16 16:07:58 crc kubenswrapper[4775]: I1216 16:07:58.843038 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21352973595c7377da49851b3f90ba5ad19981ae1c5fbbf6fa393a2e213be8fc"} err="failed to get container status \"21352973595c7377da49851b3f90ba5ad19981ae1c5fbbf6fa393a2e213be8fc\": rpc error: code = NotFound desc = could not find container \"21352973595c7377da49851b3f90ba5ad19981ae1c5fbbf6fa393a2e213be8fc\": container with ID starting with 21352973595c7377da49851b3f90ba5ad19981ae1c5fbbf6fa393a2e213be8fc not found: ID does not exist" Dec 16 16:07:58 crc kubenswrapper[4775]: I1216 16:07:58.843076 4775 scope.go:117] "RemoveContainer" containerID="689de27950b1438e5663571c1ee83fde62ec541677dfd8314fcc333f2b83ae53" Dec 16 16:07:58 crc kubenswrapper[4775]: E1216 16:07:58.843454 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"689de27950b1438e5663571c1ee83fde62ec541677dfd8314fcc333f2b83ae53\": container with ID starting with 
689de27950b1438e5663571c1ee83fde62ec541677dfd8314fcc333f2b83ae53 not found: ID does not exist" containerID="689de27950b1438e5663571c1ee83fde62ec541677dfd8314fcc333f2b83ae53" Dec 16 16:07:58 crc kubenswrapper[4775]: I1216 16:07:58.843485 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"689de27950b1438e5663571c1ee83fde62ec541677dfd8314fcc333f2b83ae53"} err="failed to get container status \"689de27950b1438e5663571c1ee83fde62ec541677dfd8314fcc333f2b83ae53\": rpc error: code = NotFound desc = could not find container \"689de27950b1438e5663571c1ee83fde62ec541677dfd8314fcc333f2b83ae53\": container with ID starting with 689de27950b1438e5663571c1ee83fde62ec541677dfd8314fcc333f2b83ae53 not found: ID does not exist" Dec 16 16:07:59 crc kubenswrapper[4775]: I1216 16:07:59.348694 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="275854a3-0a76-4f6a-9a0a-12605fa0f5b0" path="/var/lib/kubelet/pods/275854a3-0a76-4f6a-9a0a-12605fa0f5b0/volumes" Dec 16 16:09:32 crc kubenswrapper[4775]: I1216 16:09:32.869992 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 16:09:32 crc kubenswrapper[4775]: I1216 16:09:32.871072 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 16:09:59 crc kubenswrapper[4775]: I1216 16:09:59.405610 4775 scope.go:117] "RemoveContainer" containerID="a4942c5c28915fe4fb72317971898716d6506f88bf6bc88a056e4f0e0231b1ad" Dec 16 16:10:02 crc kubenswrapper[4775]: I1216 
16:10:02.869158 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 16:10:02 crc kubenswrapper[4775]: I1216 16:10:02.870830 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 16:10:32 crc kubenswrapper[4775]: I1216 16:10:32.872059 4775 patch_prober.go:28] interesting pod/machine-config-daemon-lh6xh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 16:10:32 crc kubenswrapper[4775]: I1216 16:10:32.872616 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 16:10:32 crc kubenswrapper[4775]: I1216 16:10:32.872663 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" Dec 16 16:10:32 crc kubenswrapper[4775]: I1216 16:10:32.873410 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d3e60c9c88a8eba7cfccff403f539430af46819167e101c59e7f400f53b3cfd7"} pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 16:10:32 crc kubenswrapper[4775]: I1216 16:10:32.873459 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerName="machine-config-daemon" containerID="cri-o://d3e60c9c88a8eba7cfccff403f539430af46819167e101c59e7f400f53b3cfd7" gracePeriod=600 Dec 16 16:10:33 crc kubenswrapper[4775]: I1216 16:10:33.111131 4775 generic.go:334] "Generic (PLEG): container finished" podID="584613dc-ef95-4911-9a79-76e805e1d4d1" containerID="d3e60c9c88a8eba7cfccff403f539430af46819167e101c59e7f400f53b3cfd7" exitCode=0 Dec 16 16:10:33 crc kubenswrapper[4775]: I1216 16:10:33.111236 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" event={"ID":"584613dc-ef95-4911-9a79-76e805e1d4d1","Type":"ContainerDied","Data":"d3e60c9c88a8eba7cfccff403f539430af46819167e101c59e7f400f53b3cfd7"} Dec 16 16:10:33 crc kubenswrapper[4775]: I1216 16:10:33.112155 4775 scope.go:117] "RemoveContainer" containerID="463c218c74a81c197a6e325e9e6e91831b9764405b5f63ce630e6f2e7837e133" Dec 16 16:10:33 crc kubenswrapper[4775]: E1216 16:10:33.605774 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1" Dec 16 16:10:34 crc kubenswrapper[4775]: I1216 16:10:34.121821 4775 scope.go:117] "RemoveContainer" containerID="d3e60c9c88a8eba7cfccff403f539430af46819167e101c59e7f400f53b3cfd7" Dec 16 16:10:34 crc kubenswrapper[4775]: E1216 16:10:34.122171 4775 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1"
Dec 16 16:10:48 crc kubenswrapper[4775]: I1216 16:10:48.339539 4775 scope.go:117] "RemoveContainer" containerID="d3e60c9c88a8eba7cfccff403f539430af46819167e101c59e7f400f53b3cfd7"
Dec 16 16:10:48 crc kubenswrapper[4775]: E1216 16:10:48.340395 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1"
Dec 16 16:10:59 crc kubenswrapper[4775]: I1216 16:10:59.337900 4775 scope.go:117] "RemoveContainer" containerID="d3e60c9c88a8eba7cfccff403f539430af46819167e101c59e7f400f53b3cfd7"
Dec 16 16:10:59 crc kubenswrapper[4775]: E1216 16:10:59.338583 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.349934 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mrlw2"]
Dec 16 16:11:10 crc kubenswrapper[4775]: E1216 16:11:10.350929 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d642ead4-2aa2-4ac2-a69f-010962fbf570" containerName="extract-content"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.350943 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d642ead4-2aa2-4ac2-a69f-010962fbf570" containerName="extract-content"
Dec 16 16:11:10 crc kubenswrapper[4775]: E1216 16:11:10.350957 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be58473-7d1b-4c58-a3a7-862cd4846f63" containerName="extract-content"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.350963 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be58473-7d1b-4c58-a3a7-862cd4846f63" containerName="extract-content"
Dec 16 16:11:10 crc kubenswrapper[4775]: E1216 16:11:10.350985 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="275854a3-0a76-4f6a-9a0a-12605fa0f5b0" containerName="gather"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.350991 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="275854a3-0a76-4f6a-9a0a-12605fa0f5b0" containerName="gather"
Dec 16 16:11:10 crc kubenswrapper[4775]: E1216 16:11:10.351006 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be58473-7d1b-4c58-a3a7-862cd4846f63" containerName="extract-utilities"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.351012 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be58473-7d1b-4c58-a3a7-862cd4846f63" containerName="extract-utilities"
Dec 16 16:11:10 crc kubenswrapper[4775]: E1216 16:11:10.351021 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d642ead4-2aa2-4ac2-a69f-010962fbf570" containerName="extract-utilities"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.351026 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d642ead4-2aa2-4ac2-a69f-010962fbf570" containerName="extract-utilities"
Dec 16 16:11:10 crc kubenswrapper[4775]: E1216 16:11:10.351036 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be58473-7d1b-4c58-a3a7-862cd4846f63" containerName="registry-server"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.351041 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be58473-7d1b-4c58-a3a7-862cd4846f63" containerName="registry-server"
Dec 16 16:11:10 crc kubenswrapper[4775]: E1216 16:11:10.351054 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d642ead4-2aa2-4ac2-a69f-010962fbf570" containerName="registry-server"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.351060 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d642ead4-2aa2-4ac2-a69f-010962fbf570" containerName="registry-server"
Dec 16 16:11:10 crc kubenswrapper[4775]: E1216 16:11:10.351070 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="275854a3-0a76-4f6a-9a0a-12605fa0f5b0" containerName="copy"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.351078 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="275854a3-0a76-4f6a-9a0a-12605fa0f5b0" containerName="copy"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.351289 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="275854a3-0a76-4f6a-9a0a-12605fa0f5b0" containerName="copy"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.351302 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="2be58473-7d1b-4c58-a3a7-862cd4846f63" containerName="registry-server"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.351319 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="275854a3-0a76-4f6a-9a0a-12605fa0f5b0" containerName="gather"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.351336 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d642ead4-2aa2-4ac2-a69f-010962fbf570" containerName="registry-server"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.352936 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mrlw2"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.363922 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mrlw2"]
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.489671 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcpfr\" (UniqueName: \"kubernetes.io/projected/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-kube-api-access-dcpfr\") pod \"redhat-operators-mrlw2\" (UID: \"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4\") " pod="openshift-marketplace/redhat-operators-mrlw2"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.489806 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-utilities\") pod \"redhat-operators-mrlw2\" (UID: \"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4\") " pod="openshift-marketplace/redhat-operators-mrlw2"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.490651 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-catalog-content\") pod \"redhat-operators-mrlw2\" (UID: \"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4\") " pod="openshift-marketplace/redhat-operators-mrlw2"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.592252 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-catalog-content\") pod \"redhat-operators-mrlw2\" (UID: \"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4\") " pod="openshift-marketplace/redhat-operators-mrlw2"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.592377 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcpfr\" (UniqueName: \"kubernetes.io/projected/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-kube-api-access-dcpfr\") pod \"redhat-operators-mrlw2\" (UID: \"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4\") " pod="openshift-marketplace/redhat-operators-mrlw2"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.592449 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-utilities\") pod \"redhat-operators-mrlw2\" (UID: \"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4\") " pod="openshift-marketplace/redhat-operators-mrlw2"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.592839 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-utilities\") pod \"redhat-operators-mrlw2\" (UID: \"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4\") " pod="openshift-marketplace/redhat-operators-mrlw2"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.593069 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-catalog-content\") pod \"redhat-operators-mrlw2\" (UID: \"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4\") " pod="openshift-marketplace/redhat-operators-mrlw2"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.610580 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcpfr\" (UniqueName: \"kubernetes.io/projected/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-kube-api-access-dcpfr\") pod \"redhat-operators-mrlw2\" (UID: \"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4\") " pod="openshift-marketplace/redhat-operators-mrlw2"
Dec 16 16:11:10 crc kubenswrapper[4775]: I1216 16:11:10.726307 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mrlw2"
Dec 16 16:11:11 crc kubenswrapper[4775]: I1216 16:11:11.185089 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mrlw2"]
Dec 16 16:11:11 crc kubenswrapper[4775]: I1216 16:11:11.337803 4775 scope.go:117] "RemoveContainer" containerID="d3e60c9c88a8eba7cfccff403f539430af46819167e101c59e7f400f53b3cfd7"
Dec 16 16:11:11 crc kubenswrapper[4775]: E1216 16:11:11.338090 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1"
Dec 16 16:11:11 crc kubenswrapper[4775]: I1216 16:11:11.518494 4775 generic.go:334] "Generic (PLEG): container finished" podID="4da666e0-aaed-4dfd-8ac8-f4007df2fbe4" containerID="6c0e128c999d6896a32568ad6bb9835a6a54b78e2ff506b3fa900312f58222bd" exitCode=0
Dec 16 16:11:11 crc kubenswrapper[4775]: I1216 16:11:11.518554 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrlw2" event={"ID":"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4","Type":"ContainerDied","Data":"6c0e128c999d6896a32568ad6bb9835a6a54b78e2ff506b3fa900312f58222bd"}
Dec 16 16:11:11 crc kubenswrapper[4775]: I1216 16:11:11.518582 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrlw2" event={"ID":"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4","Type":"ContainerStarted","Data":"2915d4242c3bd39169e971eadfe5fc48018ec36ba2d73ce66ece4ed5ab78d1f0"}
Dec 16 16:11:11 crc kubenswrapper[4775]: I1216 16:11:11.521617 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 16 16:11:13 crc kubenswrapper[4775]: I1216 16:11:13.536572 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrlw2" event={"ID":"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4","Type":"ContainerStarted","Data":"c90879384c4bac064dba4455271188152d6d8bcddb5b5e29cd89d9d7eeadbf29"}
Dec 16 16:11:15 crc kubenswrapper[4775]: I1216 16:11:15.554878 4775 generic.go:334] "Generic (PLEG): container finished" podID="4da666e0-aaed-4dfd-8ac8-f4007df2fbe4" containerID="c90879384c4bac064dba4455271188152d6d8bcddb5b5e29cd89d9d7eeadbf29" exitCode=0
Dec 16 16:11:15 crc kubenswrapper[4775]: I1216 16:11:15.554931 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrlw2" event={"ID":"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4","Type":"ContainerDied","Data":"c90879384c4bac064dba4455271188152d6d8bcddb5b5e29cd89d9d7eeadbf29"}
Dec 16 16:11:17 crc kubenswrapper[4775]: I1216 16:11:17.594951 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrlw2" event={"ID":"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4","Type":"ContainerStarted","Data":"b567423689e38a822f3d807c70809d8820b506eb6a96aef857637ff0c4cf05e1"}
Dec 16 16:11:17 crc kubenswrapper[4775]: I1216 16:11:17.619766 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mrlw2" podStartSLOduration=2.011900372 podStartE2EDuration="7.619747279s" podCreationTimestamp="2025-12-16 16:11:10 +0000 UTC" firstStartedPulling="2025-12-16 16:11:11.520406913 +0000 UTC m=+4596.471485826" lastFinishedPulling="2025-12-16 16:11:17.12825381 +0000 UTC m=+4602.079332733" observedRunningTime="2025-12-16 16:11:17.613556906 +0000 UTC m=+4602.564635869" watchObservedRunningTime="2025-12-16 16:11:17.619747279 +0000 UTC m=+4602.570826202"
Dec 16 16:11:20 crc kubenswrapper[4775]: I1216 16:11:20.727320 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mrlw2"
Dec 16 16:11:20 crc kubenswrapper[4775]: I1216 16:11:20.727923 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mrlw2"
Dec 16 16:11:22 crc kubenswrapper[4775]: I1216 16:11:22.232074 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mrlw2" podUID="4da666e0-aaed-4dfd-8ac8-f4007df2fbe4" containerName="registry-server" probeResult="failure" output=<
Dec 16 16:11:22 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s
Dec 16 16:11:22 crc kubenswrapper[4775]: >
Dec 16 16:11:26 crc kubenswrapper[4775]: I1216 16:11:26.338430 4775 scope.go:117] "RemoveContainer" containerID="d3e60c9c88a8eba7cfccff403f539430af46819167e101c59e7f400f53b3cfd7"
Dec 16 16:11:26 crc kubenswrapper[4775]: E1216 16:11:26.339338 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1"
Dec 16 16:11:30 crc kubenswrapper[4775]: I1216 16:11:30.790719 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mrlw2"
Dec 16 16:11:30 crc kubenswrapper[4775]: I1216 16:11:30.850364 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mrlw2"
Dec 16 16:11:31 crc kubenswrapper[4775]: I1216 16:11:31.028993 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mrlw2"]
Dec 16 16:11:32 crc kubenswrapper[4775]: I1216 16:11:32.718858 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mrlw2" podUID="4da666e0-aaed-4dfd-8ac8-f4007df2fbe4" containerName="registry-server" containerID="cri-o://b567423689e38a822f3d807c70809d8820b506eb6a96aef857637ff0c4cf05e1" gracePeriod=2
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.178037 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mrlw2"
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.362506 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-utilities\") pod \"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4\" (UID: \"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4\") "
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.362791 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-catalog-content\") pod \"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4\" (UID: \"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4\") "
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.362849 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcpfr\" (UniqueName: \"kubernetes.io/projected/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-kube-api-access-dcpfr\") pod \"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4\" (UID: \"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4\") "
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.363534 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-utilities" (OuterVolumeSpecName: "utilities") pod "4da666e0-aaed-4dfd-8ac8-f4007df2fbe4" (UID: "4da666e0-aaed-4dfd-8ac8-f4007df2fbe4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.371505 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-kube-api-access-dcpfr" (OuterVolumeSpecName: "kube-api-access-dcpfr") pod "4da666e0-aaed-4dfd-8ac8-f4007df2fbe4" (UID: "4da666e0-aaed-4dfd-8ac8-f4007df2fbe4"). InnerVolumeSpecName "kube-api-access-dcpfr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.464404 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcpfr\" (UniqueName: \"kubernetes.io/projected/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-kube-api-access-dcpfr\") on node \"crc\" DevicePath \"\""
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.464435 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.484149 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4da666e0-aaed-4dfd-8ac8-f4007df2fbe4" (UID: "4da666e0-aaed-4dfd-8ac8-f4007df2fbe4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.566370 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.728513 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mrlw2"
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.728533 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrlw2" event={"ID":"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4","Type":"ContainerDied","Data":"b567423689e38a822f3d807c70809d8820b506eb6a96aef857637ff0c4cf05e1"}
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.728580 4775 scope.go:117] "RemoveContainer" containerID="b567423689e38a822f3d807c70809d8820b506eb6a96aef857637ff0c4cf05e1"
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.728452 4775 generic.go:334] "Generic (PLEG): container finished" podID="4da666e0-aaed-4dfd-8ac8-f4007df2fbe4" containerID="b567423689e38a822f3d807c70809d8820b506eb6a96aef857637ff0c4cf05e1" exitCode=0
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.728661 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrlw2" event={"ID":"4da666e0-aaed-4dfd-8ac8-f4007df2fbe4","Type":"ContainerDied","Data":"2915d4242c3bd39169e971eadfe5fc48018ec36ba2d73ce66ece4ed5ab78d1f0"}
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.748991 4775 scope.go:117] "RemoveContainer" containerID="c90879384c4bac064dba4455271188152d6d8bcddb5b5e29cd89d9d7eeadbf29"
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.766167 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mrlw2"]
Dec 16 16:11:33 crc kubenswrapper[4775]: I1216 16:11:33.775416 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mrlw2"]
Dec 16 16:11:34 crc kubenswrapper[4775]: I1216 16:11:34.233649 4775 scope.go:117] "RemoveContainer" containerID="6c0e128c999d6896a32568ad6bb9835a6a54b78e2ff506b3fa900312f58222bd"
Dec 16 16:11:34 crc kubenswrapper[4775]: I1216 16:11:34.269181 4775 scope.go:117] "RemoveContainer" containerID="b567423689e38a822f3d807c70809d8820b506eb6a96aef857637ff0c4cf05e1"
Dec 16 16:11:34 crc kubenswrapper[4775]: E1216 16:11:34.269792 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b567423689e38a822f3d807c70809d8820b506eb6a96aef857637ff0c4cf05e1\": container with ID starting with b567423689e38a822f3d807c70809d8820b506eb6a96aef857637ff0c4cf05e1 not found: ID does not exist" containerID="b567423689e38a822f3d807c70809d8820b506eb6a96aef857637ff0c4cf05e1"
Dec 16 16:11:34 crc kubenswrapper[4775]: I1216 16:11:34.269857 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b567423689e38a822f3d807c70809d8820b506eb6a96aef857637ff0c4cf05e1"} err="failed to get container status \"b567423689e38a822f3d807c70809d8820b506eb6a96aef857637ff0c4cf05e1\": rpc error: code = NotFound desc = could not find container \"b567423689e38a822f3d807c70809d8820b506eb6a96aef857637ff0c4cf05e1\": container with ID starting with b567423689e38a822f3d807c70809d8820b506eb6a96aef857637ff0c4cf05e1 not found: ID does not exist"
Dec 16 16:11:34 crc kubenswrapper[4775]: I1216 16:11:34.269907 4775 scope.go:117] "RemoveContainer" containerID="c90879384c4bac064dba4455271188152d6d8bcddb5b5e29cd89d9d7eeadbf29"
Dec 16 16:11:34 crc kubenswrapper[4775]: E1216 16:11:34.270292 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c90879384c4bac064dba4455271188152d6d8bcddb5b5e29cd89d9d7eeadbf29\": container with ID starting with c90879384c4bac064dba4455271188152d6d8bcddb5b5e29cd89d9d7eeadbf29 not found: ID does not exist" containerID="c90879384c4bac064dba4455271188152d6d8bcddb5b5e29cd89d9d7eeadbf29"
Dec 16 16:11:34 crc kubenswrapper[4775]: I1216 16:11:34.270354 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c90879384c4bac064dba4455271188152d6d8bcddb5b5e29cd89d9d7eeadbf29"} err="failed to get container status \"c90879384c4bac064dba4455271188152d6d8bcddb5b5e29cd89d9d7eeadbf29\": rpc error: code = NotFound desc = could not find container \"c90879384c4bac064dba4455271188152d6d8bcddb5b5e29cd89d9d7eeadbf29\": container with ID starting with c90879384c4bac064dba4455271188152d6d8bcddb5b5e29cd89d9d7eeadbf29 not found: ID does not exist"
Dec 16 16:11:34 crc kubenswrapper[4775]: I1216 16:11:34.270381 4775 scope.go:117] "RemoveContainer" containerID="6c0e128c999d6896a32568ad6bb9835a6a54b78e2ff506b3fa900312f58222bd"
Dec 16 16:11:34 crc kubenswrapper[4775]: E1216 16:11:34.270655 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c0e128c999d6896a32568ad6bb9835a6a54b78e2ff506b3fa900312f58222bd\": container with ID starting with 6c0e128c999d6896a32568ad6bb9835a6a54b78e2ff506b3fa900312f58222bd not found: ID does not exist" containerID="6c0e128c999d6896a32568ad6bb9835a6a54b78e2ff506b3fa900312f58222bd"
Dec 16 16:11:34 crc kubenswrapper[4775]: I1216 16:11:34.270684 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c0e128c999d6896a32568ad6bb9835a6a54b78e2ff506b3fa900312f58222bd"} err="failed to get container status \"6c0e128c999d6896a32568ad6bb9835a6a54b78e2ff506b3fa900312f58222bd\": rpc error: code = NotFound desc = could not find container \"6c0e128c999d6896a32568ad6bb9835a6a54b78e2ff506b3fa900312f58222bd\": container with ID starting with 6c0e128c999d6896a32568ad6bb9835a6a54b78e2ff506b3fa900312f58222bd not found: ID does not exist"
Dec 16 16:11:35 crc kubenswrapper[4775]: I1216 16:11:35.351011 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4da666e0-aaed-4dfd-8ac8-f4007df2fbe4" path="/var/lib/kubelet/pods/4da666e0-aaed-4dfd-8ac8-f4007df2fbe4/volumes"
Dec 16 16:11:37 crc kubenswrapper[4775]: I1216 16:11:37.338438 4775 scope.go:117] "RemoveContainer" containerID="d3e60c9c88a8eba7cfccff403f539430af46819167e101c59e7f400f53b3cfd7"
Dec 16 16:11:37 crc kubenswrapper[4775]: E1216 16:11:37.338740 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lh6xh_openshift-machine-config-operator(584613dc-ef95-4911-9a79-76e805e1d4d1)\"" pod="openshift-machine-config-operator/machine-config-daemon-lh6xh" podUID="584613dc-ef95-4911-9a79-76e805e1d4d1"